Amazon no longer offers textbook rentals. We do!

We're the #1 textbook rental company. Let us show you why.

Monte Carlo Statistical Methods

by Christian P. Robert; George Casella
  • ISBN13:

    9780387212395

  • ISBN10:

    0387212396

  • Edition: 2nd
  • Format: Hardcover
  • Copyright: 2004-10-30
  • Publisher: Springer Nature
  • Purchase Benefits
  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks, and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now
List Price: $119.99 (save up to $79.83)
  • Digital: $87.02

Summary

Monte Carlo statistical methods, particularly those based on Markov chains, are now an essential component of the standard set of techniques used by statisticians. This new edition has been revised toward a coherent and flowing coverage of these simulation techniques, incorporating the most recent developments in the field. In particular, the introductory coverage of random variable generation has been totally revised, with many concepts unified through a fundamental theorem of simulation. There are five completely new chapters that cover Monte Carlo control, reversible jump, slice sampling, sequential Monte Carlo, and perfect sampling. Gibbs sampling receives more in-depth coverage, now spread over three consecutive chapters: the development starts with slice sampling and its connection with the fundamental theorem of simulation, builds up to two-stage Gibbs sampling and its theoretical properties, and a third chapter covers the multi-stage Gibbs sampler and its variety of applications. Lastly, chapters from the previous edition have been revised for easier access, with the examples given more detailed coverage.

This textbook is intended for a second-year graduate course, but will also be useful to anyone who either wants to apply simulation techniques to practical problems or wishes to grasp the fundamental principles behind those methods. The authors do not assume familiarity with Monte Carlo techniques (such as random variable generation), with computer programming, or with any Markov chain theory (the necessary concepts are developed in Chapter 6). A solutions manual, covering approximately 40% of the problems, is available to instructors who require the book for a course.

Christian P. Robert is Professor of Statistics in the Applied Mathematics Department at Université Paris Dauphine, France. He is also Head of the Statistics Laboratory at the Center for Research in Economics and Statistics (CREST) of the National Institute for Statistics and Economic Studies (INSEE) in Paris, and Adjunct Professor at École Polytechnique. He has written three other books, including The Bayesian Choice, Second Edition (Springer, 2001), and edited Discretization and MCMC Convergence Assessment (Springer, 1998). He has served as associate editor for the Annals of Statistics and the Journal of the American Statistical Association. He is a fellow of the Institute of Mathematical Statistics and a winner of the 1995 Young Statistician Award of the Société de Statistique de Paris.

George Casella is Distinguished Professor and Chair of the Department of Statistics, University of Florida. He has served as Theory and Methods Editor of the Journal of the American Statistical Association and Executive Editor of Statistical Science. He has authored three other textbooks: Statistical Inference, Second Edition (2001, with Roger L. Berger); Theory of Point Estimation (1998, with Erich Lehmann); and Variance Components (1992, with Shayle R. Searle and Charles E. McCulloch). He is a fellow of the Institute of Mathematical Statistics and the American Statistical Association, and an elected fellow of the International Statistical Institute.
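As a flavor of the accept-reject method the book covers (Section 2.3, built on the fundamental theorem of simulation), here is a minimal illustrative sketch in Python. The function names and the Beta(2, 2) example are ours, not taken from the book:

```python
import random

def accept_reject(target_pdf, proposal_sample, proposal_pdf, m, n):
    """Draw n samples from target_pdf by accept-reject, assuming the
    envelope condition target_pdf(x) <= m * proposal_pdf(x) for all x."""
    samples = []
    while len(samples) < n:
        x = proposal_sample()   # propose from the easy-to-sample density g
        u = random.random()     # uniform auxiliary variable
        # accept x with probability f(x) / (M * g(x))
        if u * m * proposal_pdf(x) <= target_pdf(x):
            samples.append(x)
    return samples

# Illustration: sample from Beta(2, 2), whose density 6x(1-x) is bounded
# by M = 1.5 on [0, 1], using a Uniform(0, 1) proposal.
beta22 = lambda x: 6 * x * (1 - x)
draws = accept_reject(beta22, random.random, lambda x: 1.0, 1.5, 5000)
```

The overall acceptance rate is 1/M (about 2/3 here), which is why the book devotes attention to finding tight envelopes and to squeeze methods (Section 2.4).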

Author Biography

Christian P. Robert is Professor of Statistics in the Applied Mathematics Department at Université Paris Dauphine, France.

Table of Contents

Preface to the Second Edition ix
Preface to the First Edition xiii
1 Introduction
1(34)
1.1 Statistical Models
1(4)
1.2 Likelihood Methods
5(7)
1.3 Bayesian Methods
12(7)
1.4 Deterministic Numerical Methods
19(4)
1.4.1 Optimization
19(2)
1.4.2 Integration
21(1)
1.4.3 Comparison
21(2)
1.5 Problems
23(7)
1.6 Notes
30(5)
1.6.1 Prior Distributions
30(2)
1.6.2 Bootstrap Methods
32(3)
2 Random Variable Generation
35(44)
2.1 Introduction
35(7)
2.1.1 Uniform Simulation
36(2)
2.1.2 The Inverse Transform
38(2)
2.1.3 Alternatives
40(1)
2.1.4 Optimal Algorithms
41(1)
2.2 General Transformation Methods
42(5)
2.3 Accept-Reject Methods
47(6)
2.3.1 The Fundamental Theorem of Simulation
47(4)
2.3.2 The Accept-Reject Algorithm
51(2)
2.4 Envelope Accept-Reject Methods
53(9)
2.4.1 The Squeeze Principle
53(3)
2.4.2 Log-Concave Densities
56(6)
2.5 Problems
62(10)
2.6 Notes
72(7)
2.6.1 The KISS Generator
72(3)
2.6.2 Quasi-Monte Carlo Methods
75(2)
2.6.3 Mixture Representations
77(2)
3 Monte Carlo Integration
79(44)
3.1 Introduction
79(4)
3.2 Classical Monte Carlo Integration
83(7)
3.3 Importance Sampling
90(17)
3.3.1 Principles
90(4)
3.3.2 Finite Variance Estimators
94(9)
3.3.3 Comparing Importance Sampling with Accept-Reject
103(4)
3.4 Laplace Approximations
107(3)
3.5 Problems
110(9)
3.6 Notes
119(4)
3.6.1 Large Deviations Techniques
119(1)
3.6.2 The Saddlepoint Approximation
120(3)
4 Controlling Monte Carlo Variance
123(34)
4.1 Monitoring Variation with the CLT
123(7)
4.1.1 Univariate Monitoring
124(4)
4.1.2 Multivariate Monitoring
128(2)
4.2 Rao-Blackwellization
130(4)
4.3 Riemann Approximations
134(6)
4.4 Acceleration Methods
140(7)
4.4.1 Antithetic Variables
140(5)
4.4.2 Control Variates
145(2)
4.5 Problems
147(6)
4.6 Notes
153(4)
4.6.1 Monitoring Importance Sampling Convergence
153(1)
4.6.2 Accept-Reject with Loose Bounds
154(1)
4.6.3 Partitioning
155(2)
5 Monte Carlo Optimization
157(48)
5.1 Introduction
157(2)
5.2 Stochastic Exploration
159(15)
5.2.1 A Basic Solution
159(3)
5.2.2 Gradient Methods
162(1)
5.2.3 Simulated Annealing
163(6)
5.2.4 Prior Feedback
169(5)
5.3 Stochastic Approximation
174(14)
5.3.1 Missing Data Models and Demarginalization
174(2)
5.3.2 The EM Algorithm
176(7)
5.3.3 Monte Carlo EM
183(3)
5.3.4 EM Standard Errors
186(2)
5.4 Problems
188(12)
5.5 Notes
200(5)
5.5.1 Variations on EM
200(1)
5.5.2 Neural Networks
201(1)
5.5.3 The Robbins-Monro procedure
201(2)
5.5.4 Monte Carlo Approximation
203(2)
6 Markov Chains
205(62)
6.1 Essentials for MCMC
206(2)
6.2 Basic Notions
208(5)
6.3 Irreducibility, Atoms, and Small Sets
213(5)
6.3.1 Irreducibility
213(1)
6.3.2 Atoms and Small Sets
214(3)
6.3.3 Cycles and Aperiodicity
217(1)
6.4 Transience and Recurrence
218(5)
6.4.1 Classification of Irreducible Chains
218(3)
6.4.2 Criteria for Recurrence
221(1)
6.4.3 Harris Recurrence
221(2)
6.5 Invariant Measures
223(8)
6.5.1 Stationary Chains
223(1)
6.5.2 Kac's Theorem
224(5)
6.5.3 Reversibility and the Detailed Balance Condition
229(2)
6.6 Ergodicity and Convergence
231(7)
6.6.1 Ergodicity
231(5)
6.6.2 Geometric Convergence
236(1)
6.6.3 Uniform Ergodicity
237(1)
6.7 Limit Theorems
238(9)
6.7.1 Ergodic Theorems
240(2)
6.7.2 Central Limit Theorems
242(5)
6.8 Problems
247(11)
6.9 Notes
258(9)
6.9.1 Drift Conditions
258(4)
6.9.2 Eaton's Admissibility Condition
262(1)
6.9.3 Alternative Convergence Conditions
263(1)
6.9.4 Mixing Conditions and Central Limit Theorems
263(2)
6.9.5 Covariance in Markov Chains
265(2)
7 The Metropolis-Hastings Algorithm
267(54)
7.1 The MCMC Principle
267(2)
7.2 Monte Carlo Methods Based on Markov Chains
269(1)
7.3 The Metropolis-Hastings Algorithm
270(6)
7.3.1 Definition
270(2)
7.3.2 Convergence Properties
272(4)
7.4 The Independent Metropolis-Hastings Algorithm
276(11)
7.4.1 Fixed Proposals
276(9)
7.4.2 A Metropolis-Hastings Version of ARS
285(2)
7.5 Random Walks
287(5)
7.6 Optimization and Control
292(10)
7.6.1 Optimizing the Acceptance Rate
292(3)
7.6.2 Conditioning and Accelerations
295(4)
7.6.3 Adaptive Schemes
299(3)
7.7 Problems
302(11)
7.8 Notes
313(8)
7.8.1 Background of the Metropolis Algorithm
313(2)
7.8.2 Geometric Convergence of Metropolis-Hastings Algorithms
315(1)
7.8.3 A Reinterpretation of Simulated Annealing
315(1)
7.8.4 Reference Acceptance Rates
316(2)
7.8.5 Langevin Algorithms
318(3)
8 The Slice Sampler
321(16)
8.1 Another Look at the Fundamental Theorem
321(5)
8.2 The General Slice Sampler
326(3)
8.3 Convergence Properties of the Slice Sampler
329(4)
8.4 Problems
333(2)
8.5 Notes
335(2)
8.5.1 Dealing with Difficult Slices
335(2)
9 The Two-Stage Gibbs Sampler
337(34)
9.1 A General Class of Two-Stage Algorithms
337(7)
9.1.1 From Slice Sampling to Gibbs Sampling
337(2)
9.1.2 Definition
339(4)
9.1.3 Back to the Slice Sampler
343(1)
9.1.4 The Hammersley-Clifford Theorem
343(1)
9.2 Fundamental Properties
344(10)
9.2.1 Probabilistic Structures
344(5)
9.2.2 Reversible and Interleaving Chains
349(2)
9.2.3 The Duality Principle
351(3)
9.3 Monotone Covariance and Rao-Blackwellization
354(3)
9.4 The EM-Gibbs Connection
357(3)
9.5 Transition
360(1)
9.6 Problems
360(6)
9.7 Notes
366(5)
9.7.1 Inference for Mixtures
366(2)
9.7.2 ARCH Models
368(3)
10 The Multi-Stage Gibbs Sampler
371(54)
10.1 Basic Derivations
371(7)
10.1.1 Definition
371(2)
10.1.2 Completion
373(3)
10.1.3 The General Hammersley-Clifford Theorem
376(2)
10.2 Theoretical Justifications
378(9)
10.2.1 Markov Properties of the Gibbs Sampler
378(3)
10.2.2 Gibbs Sampling as Metropolis-Hastings
381(2)
10.2.3 Hierarchical Structures
383(4)
10.3 Hybrid Gibbs Samplers
387(9)
10.3.1 Comparison with Metropolis-Hastings Algorithms
387(1)
10.3.2 Mixtures and Cycles
388(4)
10.3.3 Metropolizing the Gibbs Sampler
392(4)
10.4 Statistical Considerations
396(11)
10.4.1 Reparameterization
396(6)
10.4.2 Rao-Blackwellization
402(1)
10.4.3 Improper Priors
403(4)
10.5 Problems
407(12)
10.6 Notes
419(6)
10.6.1 A Bit of Background
419(1)
10.6.2 The BUGS Software
420(1)
10.6.3 Nonparametric Mixtures
420(2)
10.6.4 Graphical Models
422(3)
11 Variable Dimension Models and Reversible Jump Algorithms
425(34)
11.1 Variable Dimension Models
425(4)
11.1.1 Bayesian Model Choice
426(1)
11.1.2 Difficulties in Model Choice
427(2)
11.2 Reversible Jump Algorithms
429(15)
11.2.1 Green's Algorithm
429(3)
11.2.2 A Fixed Dimension Reassessment
432(1)
11.2.3 The Practice of Reversible Jump MCMC
433(11)
11.3 Alternatives to Reversible Jump MCMC
444(5)
11.3.1 Saturation
444(2)
11.3.2 Continuous-Time Jump Processes
446(3)
11.4 Problems
449(9)
11.5 Notes
458(1)
11.5.1 Occam's Razor
458(1)
12 Diagnosing Convergence
459(52)
12.1 Stopping the Chain
459(6)
12.1.1 Convergence Criteria
461(3)
12.1.2 Multiple Chains
464(1)
12.1.3 Monitoring Reconsidered
465(1)
12.2 Monitoring Convergence to the Stationary Distribution
465(15)
12.2.1 A First Illustration
465(1)
12.2.2 Nonparametric Tests of Stationarity
466(4)
12.2.3 Renewal Methods
470(4)
12.2.4 Missing Mass
474(4)
12.2.5 Distance Evaluations
478(2)
12.3 Monitoring Convergence of Averages
480(20)
12.3.1 A First Illustration
480(3)
12.3.2 Multiple Estimates
483(7)
12.3.3 Renewal Theory
490(7)
12.3.4 Within and Between Variances
497(2)
12.3.5 Effective Sample Size
499(1)
12.4 Simultaneous Monitoring
500(4)
12.4.1 Binary Control
500(3)
12.4.2 Valid Discretization
503(1)
12.5 Problems
504(4)
12.6 Notes
508(3)
12.6.1 Spectral Analysis
508(1)
12.6.2 The CODA Software
509(2)
13 Perfect Sampling
511(34)
13.1 Introduction
511(2)
13.2 Coupling from the Past
513(19)
13.2.1 Random Mappings and Coupling
513(3)
13.2.2 Propp and Wilson's Algorithm
516(2)
13.2.3 Monotonicity and Envelopes
518(5)
13.2.4 Continuous State Spaces
523(3)
13.2.5 Perfect Slice Sampling
526(4)
13.2.6 Perfect Sampling via Automatic Coupling
530(2)
13.3 Forward Coupling
532(3)
13.4 Perfect Sampling in Practice
535(1)
13.5 Problems
536(3)
13.6 Notes
539(6)
13.6.1 History
539(1)
13.6.2 Perfect Sampling and Tempering
540(5)
14 Iterated and Sequential Importance Sampling
545(36)
14.1 Introduction
545(1)
14.2 Generalized Importance Sampling
546(1)
14.3 Particle Systems
547(12)
14.3.1 Sequential Monte Carlo
547(2)
14.3.2 Hidden Markov Models
549(2)
14.3.3 Weight Degeneracy
551(1)
14.3.4 Particle Filters
552(2)
14.3.5 Sampling Strategies
554(2)
14.3.6 Fighting the Degeneracy
556(2)
14.3.7 Convergence of Particle Systems
558(1)
14.4 Population Monte Carlo
559(11)
14.4.1 Sample Simulation
560(1)
14.4.2 General Iterative Importance Sampling
560(2)
14.4.3 Population Monte Carlo
562(1)
14.4.4 An Illustration for the Mixture Model
563(2)
14.4.5 Adaptativity in Sequential Algorithms
565(5)
14.5 Problems
570(7)
14.6 Notes
577(8)
14.6.1 A Brief History of Particle Systems
577(1)
14.6.2 Dynamic Importance Sampling
577(2)
14.6.3 Hidden Markov Models
579(2)
A Probability Distributions
581(4)
B Notation
585(6)
B.1 Mathematical
585(1)
B.2 Probability
586(1)
B.3 Distributions
586(1)
B.4 Markov Chains
587(1)
B.5 Statistics
588(1)
B.6 Algorithms
588(3)
References
591(32)
Index of Names
623(8)
Index of Subjects
631

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
