Availability:

Library | Item Barcode | Call Number | Material Type | Item Category | Status
---|---|---|---|---|---
 | 30000010124246 | QA279.5 G35 2006 | Open Access Book | Book | On Order
Summary
While there have been few theoretical contributions on the Markov Chain Monte Carlo (MCMC) methods in the past decade, current understanding and application of MCMC to the solution of inference problems has increased by leaps and bounds. Incorporating changes in theory and highlighting new applications, Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference, Second Edition presents a concise, accessible, and comprehensive introduction to the methods of this valuable simulation technique. The second edition includes access to an internet site that provides the code, written in R and WinBUGS, used in many of the previously existing and new examples and exercises. More importantly, the self-explanatory nature of the code makes it easy to modify the inputs and to explore variations in many directions.
Major changes from the previous edition:
- More examples with discussion of computational details in chapters on Gibbs sampling and Metropolis-Hastings algorithms
- Recent developments in MCMC, including reversible jump, slice sampling, bridge sampling, path sampling, multiple-try, and delayed rejection
- Discussion of computation using both R and WinBUGS
- Additional exercises and selected solutions within the text, with all data sets and software available for download from the Web
- Sections on spatial models and model adequacy
The self-contained text units make MCMC accessible to scientists in other disciplines as well as statisticians. The book will appeal to everyone working with MCMC techniques, especially research and graduate statisticians and biostatisticians, and scientists handling data and formulating models. The book has been substantially reinforced as a first reading of material on MCMC and, consequently, as a textbook for modern Bayesian computation and Bayesian inference courses.
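To give a flavor of the simulation techniques the book covers (here, the random-walk Metropolis-Hastings algorithm of Chapter 6), below is a minimal sketch in Python. The book's own examples use R and WinBUGS; the function name, parameters, and target density here are illustrative assumptions, not the authors' code.

```python
import math
import random

def metropolis_normal(n_iter=5000, step=1.0, seed=42):
    """Random-walk Metropolis sampler for a standard normal target.

    At each iteration, propose x' = x + N(0, step^2) and accept it with
    probability min(1, pi(x') / pi(x)), where pi is the (unnormalized)
    target density; otherwise keep the current state.
    """
    rng = random.Random(seed)
    log_target = lambda x: -0.5 * x * x  # log of unnormalized N(0, 1) density
    x = 0.0
    samples = []
    for _ in range(n_iter):
        proposal = x + rng.gauss(0.0, step)
        # Accept/reject on the log scale for numerical stability
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

chain = metropolis_normal()
```

After discarding an initial burn-in, the empirical mean and variance of `chain` should approximate those of the N(0, 1) target; this is the Markov-chain convergence behavior the book develops formally in Chapters 4-6.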
Table of Contents
Preface to the second edition | p. xiii |
Preface to the first edition | p. xv |
Introduction | p. 1 |
1 Stochastic simulation | p. 9 |
1.1 Introduction | p. 9 |
1.2 Generation of discrete random quantities | p. 10 |
1.2.1 Bernoulli distribution | p. 11 |
1.2.2 Binomial distribution | p. 11 |
1.2.3 Geometric and negative binomial distribution | p. 12 |
1.2.4 Poisson distribution | p. 12 |
1.3 Generation of continuous random quantities | p. 13 |
1.3.1 Probability integral transform | p. 13 |
1.3.2 Bivariate techniques | p. 14 |
1.3.3 Methods based on mixtures | p. 17 |
1.4 Generation of random vectors and matrices | p. 20 |
1.4.1 Multivariate normal distribution | p. 21 |
1.4.2 Wishart distribution | p. 23 |
1.4.3 Multivariate Student's t distribution | p. 24 |
1.5 Resampling methods | p. 25 |
1.5.1 Rejection method | p. 25 |
1.5.2 Weighted resampling method | p. 30 |
1.5.3 Adaptive rejection method | p. 32 |
1.6 Exercises | p. 34 |
2 Bayesian inference | p. 41 |
2.1 Introduction | p. 41 |
2.2 Bayes' theorem | p. 41 |
2.2.1 Prior, posterior and predictive distributions | p. 42 |
2.2.2 Summarizing the information | p. 47 |
2.3 Conjugate distributions | p. 49 |
2.3.1 Conjugate distributions for the exponential family | p. 51 |
2.3.2 Conjugacy and regression models | p. 55 |
2.3.3 Conditional conjugacy | p. 58 |
2.4 Hierarchical models | p. 60 |
2.5 Dynamic models | p. 63 |
2.5.1 Sequential inference | p. 64 |
2.5.2 Smoothing | p. 65 |
2.5.3 Extensions | p. 67 |
2.6 Spatial models | p. 68 |
2.7 Model comparison | p. 72 |
2.8 Exercises | p. 74 |
3 Approximate methods of inference | p. 81 |
3.1 Introduction | p. 81 |
3.2 Asymptotic approximations | p. 82 |
3.2.1 Normal approximations | p. 83 |
3.2.2 Mode calculation | p. 86 |
3.2.3 Standard Laplace approximation | p. 88 |
3.2.4 Exponential form Laplace approximations | p. 90 |
3.3 Approximations by Gaussian quadrature | p. 93 |
3.4 Monte Carlo integration | p. 95 |
3.5 Methods based on stochastic simulation | p. 98 |
3.5.1 Bayes' theorem via the rejection method | p. 100 |
3.5.2 Bayes' theorem via weighted resampling | p. 101 |
3.5.3 Application to dynamic models | p. 104 |
3.6 Exercises | p. 106 |
4 Markov chains | p. 113 |
4.1 Introduction | p. 113 |
4.2 Definition and transition probabilities | p. 114 |
4.3 Decomposition of the state space | p. 118 |
4.4 Stationary distributions | p. 121 |
4.5 Limiting theorems | p. 124 |
4.6 Reversible chains | p. 127 |
4.7 Continuous state spaces | p. 129 |
4.7.1 Transition kernels | p. 129 |
4.7.2 Stationarity and limiting results | p. 131 |
4.8 Simulation of a Markov chain | p. 132 |
4.9 Data augmentation or substitution sampling | p. 135 |
4.10 Exercises | p. 136 |
5 Gibbs sampling | p. 141 |
5.1 Introduction | p. 141 |
5.2 Definition and properties | p. 142 |
5.3 Implementation and optimization | p. 148 |
5.3.1 Forming the sample | p. 148 |
5.3.2 Scanning strategies | p. 150 |
5.3.3 Using the sample | p. 151 |
5.3.4 Reparametrization | p. 152 |
5.3.5 Blocking | p. 155 |
5.3.6 Sampling from the full conditional distributions | p. 156 |
5.4 Convergence diagnostics | p. 157 |
5.4.1 Rate of convergence | p. 158 |
5.4.2 Informal convergence monitors | p. 159 |
5.4.3 Convergence prescription | p. 161 |
5.4.4 Formal convergence methods | p. 164 |
5.5 Applications | p. 169 |
5.5.1 Hierarchical models | p. 169 |
5.5.2 Dynamic models | p. 172 |
5.5.3 Spatial models | p. 176 |
5.6 MCMC-based software for Bayesian modeling | p. 178 |
Appendix 5.A BUGS code for Example 5.7 | p. 182 |
Appendix 5.B BUGS code for Example 5.8 | p. 184 |
5.7 Exercises | p. 184 |
6 Metropolis-Hastings algorithms | p. 191 |
6.1 Introduction | p. 191 |
6.2 Definition and properties | p. 193 |
6.3 Special cases | p. 198 |
6.3.1 Symmetric chains | p. 198 |
6.3.2 Random walk chains | p. 198 |
6.3.3 Independence chains | p. 199 |
6.3.4 Other forms | p. 204 |
6.4 Hybrid algorithms | p. 205 |
6.4.1 Componentwise transition | p. 206 |
6.4.2 Metropolis within Gibbs | p. 211 |
6.4.3 Blocking | p. 214 |
6.4.4 Reparametrization | p. 216 |
6.5 Applications | p. 217 |
6.5.1 Generalized linear mixed models | p. 217 |
6.5.2 Dynamic linear models | p. 223 |
6.5.3 Dynamic generalized linear models | p. 226 |
6.5.4 Spatial models | p. 231 |
6.6 Exercises | p. 234 |
7 Further topics in MCMC | p. 237 |
7.1 Introduction | p. 237 |
7.2 Model adequacy | p. 237 |
7.2.1 Estimates of the predictive likelihood | p. 238 |
7.2.2 Uses of the predictive likelihood | p. 248 |
7.2.3 Deviance information criterion | p. 253 |
7.3 Model choice: MCMC over model and parameter spaces | p. 257 |
7.3.1 Markov chain for supermodels | p. 258 |
7.3.2 Markov chain with jumps | p. 261 |
7.3.3 Further issues related to RJMCMC algorithms | p. 270 |
7.4 Convergence acceleration | p. 271 |
7.4.1 Alterations to the chain | p. 271 |
7.4.2 Alterations to the equilibrium distribution | p. 278 |
7.4.3 Auxiliary variables | p. 282 |
7.5 Exercises | p. 284 |
References | p. 289 |
Author index | p. 311 |
Subject index | p. 316 |