Title: Markov chain Monte Carlo : stochastic simulation for Bayesian inference
Personal Author:
Series: Texts in statistical science ; 68
Edition: 2nd ed.
Publication Information: Boca Raton, FL : Taylor & Francis, 2006
ISBN: 9781584885870

Summary


While there have been few theoretical contributions to Markov chain Monte Carlo (MCMC) methods in the past decade, the understanding and application of MCMC to the solution of inference problems has increased by leaps and bounds. Incorporating changes in theory and highlighting new applications, Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference, Second Edition presents a concise, accessible, and comprehensive introduction to the methods of this valuable simulation technique. The second edition includes access to an Internet site providing the code, written in R and WinBUGS, used in many of the existing and new examples and exercises. More importantly, the self-explanatory nature of the code makes it easy to modify the inputs and to explore variations in many directions.

Major changes from the previous edition:

• More examples with discussion of computational details in chapters on Gibbs sampling and Metropolis-Hastings algorithms

• Recent developments in MCMC, including reversible jump, slice sampling, bridge sampling, path sampling, multiple-try, and delayed rejection

• Discussion of computation using both R and WinBUGS

• Additional exercises and selected solutions within the text, with all data sets and software available for download from the Web

• Sections on spatial models and model adequacy
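
Among the methods listed above, Gibbs sampling (covered in Chapter 5) draws each component in turn from its full conditional distribution. As an illustrative sketch only, not code from the book, the following Python example samples a toy bivariate normal target with unit variances and a hypothetical correlation `rho`, for which the full conditionals are known in closed form:

```python
import random

def gibbs_bivariate_normal(n_iter, rho=0.8, seed=1):
    """Gibbs sampler for a bivariate normal with unit variances
    and correlation rho (a hypothetical demonstration target)."""
    rng = random.Random(seed)
    # Conditional standard deviation: x | y ~ N(rho*y, 1 - rho^2).
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        # Draw each coordinate from its full conditional in turn.
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(20000)
```

After a burn-in period, the empirical means and correlation of the draws should approximate those of the target (means near 0, correlation near `rho`).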

The self-contained text units make MCMC accessible to scientists in other disciplines as well as statisticians. The book will appeal to everyone working with MCMC techniques, especially research and graduate statisticians and biostatisticians, and scientists who handle data and formulate models. It has been substantially strengthened as a first reading on MCMC and, consequently, as a textbook for courses on modern Bayesian computation and Bayesian inference.
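
The book's examples are coded in R and WinBUGS; to give a flavor of the material in Chapter 6, here is a minimal random-walk Metropolis sketch in Python. The target density and tuning constants are hypothetical, chosen only for demonstration:

```python
import math
import random

def log_target(x):
    # Hypothetical target: standard normal log-density, up to a constant.
    return -0.5 * x * x

def random_walk_metropolis(n_iter, x0=0.0, step=1.0, seed=42):
    """Random-walk Metropolis sampler for a univariate log-density."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_iter):
        # Propose a symmetric Gaussian step around the current state.
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, pi(proposal) / pi(x)),
        # computed on the log scale for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

draws = random_walk_metropolis(20000)
# After discarding burn-in, the sample mean should be near 0 and the
# sample variance near 1 for this standard normal target.
kept = draws[5000:]
mean = sum(kept) / len(kept)
```

Because the proposal is symmetric, the Hastings correction cancels and the acceptance ratio reduces to the ratio of target densities, which is the special case discussed in Section 6.3.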


Table of Contents

Preface to the second edition, p. xiii
Preface to the first edition, p. xv
Introduction, p. 1
1 Stochastic simulation, p. 9
1.1 Introduction, p. 9
1.2 Generation of discrete random quantities, p. 10
1.2.1 Bernoulli distribution, p. 11
1.2.2 Binomial distribution, p. 11
1.2.3 Geometric and negative binomial distribution, p. 12
1.2.4 Poisson distribution, p. 12
1.3 Generation of continuous random quantities, p. 13
1.3.1 Probability integral transform, p. 13
1.3.2 Bivariate techniques, p. 14
1.3.3 Methods based on mixtures, p. 17
1.4 Generation of random vectors and matrices, p. 20
1.4.1 Multivariate normal distribution, p. 21
1.4.2 Wishart distribution, p. 23
1.4.3 Multivariate Student's t distribution, p. 24
1.5 Resampling methods, p. 25
1.5.1 Rejection method, p. 25
1.5.2 Weighted resampling method, p. 30
1.5.3 Adaptive rejection method, p. 32
1.6 Exercises, p. 34
2 Bayesian inference, p. 41
2.1 Introduction, p. 41
2.2 Bayes' theorem, p. 41
2.2.1 Prior, posterior and predictive distributions, p. 42
2.2.2 Summarizing the information, p. 47
2.3 Conjugate distributions, p. 49
2.3.1 Conjugate distributions for the exponential family, p. 51
2.3.2 Conjugacy and regression models, p. 55
2.3.3 Conditional conjugacy, p. 58
2.4 Hierarchical models, p. 60
2.5 Dynamic models, p. 63
2.5.1 Sequential inference, p. 64
2.5.2 Smoothing, p. 65
2.5.3 Extensions, p. 67
2.6 Spatial models, p. 68
2.7 Model comparison, p. 72
2.8 Exercises, p. 74
3 Approximate methods of inference, p. 81
3.1 Introduction, p. 81
3.2 Asymptotic approximations, p. 82
3.2.1 Normal approximations, p. 83
3.2.2 Mode calculation, p. 86
3.2.3 Standard Laplace approximation, p. 88
3.2.4 Exponential form Laplace approximations, p. 90
3.3 Approximations by Gaussian quadrature, p. 93
3.4 Monte Carlo integration, p. 95
3.5 Methods based on stochastic simulation, p. 98
3.5.1 Bayes' theorem via the rejection method, p. 100
3.5.2 Bayes' theorem via weighted resampling, p. 101
3.5.3 Application to dynamic models, p. 104
3.6 Exercises, p. 106
4 Markov chains, p. 113
4.1 Introduction, p. 113
4.2 Definition and transition probabilities, p. 114
4.3 Decomposition of the state space, p. 118
4.4 Stationary distributions, p. 121
4.5 Limiting theorems, p. 124
4.6 Reversible chains, p. 127
4.7 Continuous state spaces, p. 129
4.7.1 Transition kernels, p. 129
4.7.2 Stationarity and limiting results, p. 131
4.8 Simulation of a Markov chain, p. 132
4.9 Data augmentation or substitution sampling, p. 135
4.10 Exercises, p. 136
5 Gibbs sampling, p. 141
5.1 Introduction, p. 141
5.2 Definition and properties, p. 142
5.3 Implementation and optimization, p. 148
5.3.1 Forming the sample, p. 148
5.3.2 Scanning strategies, p. 150
5.3.3 Using the sample, p. 151
5.3.4 Reparametrization, p. 152
5.3.5 Blocking, p. 155
5.3.6 Sampling from the full conditional distributions, p. 156
5.4 Convergence diagnostics, p. 157
5.4.1 Rate of convergence, p. 158
5.4.2 Informal convergence monitors, p. 159
5.4.3 Convergence prescription, p. 161
5.4.4 Formal convergence methods, p. 164
5.5 Applications, p. 169
5.5.1 Hierarchical models, p. 169
5.5.2 Dynamic models, p. 172
5.5.3 Spatial models, p. 176
5.6 MCMC-based software for Bayesian modeling, p. 178
Appendix 5.A BUGS code for Example 5.7, p. 182
Appendix 5.B BUGS code for Example 5.8, p. 184
5.7 Exercises, p. 184
6 Metropolis-Hastings algorithms, p. 191
6.1 Introduction, p. 191
6.2 Definition and properties, p. 193
6.3 Special cases, p. 198
6.3.1 Symmetric chains, p. 198
6.3.2 Random walk chains, p. 198
6.3.3 Independence chains, p. 199
6.3.4 Other forms, p. 204
6.4 Hybrid algorithms, p. 205
6.4.1 Componentwise transition, p. 206
6.4.2 Metropolis within Gibbs, p. 211
6.4.3 Blocking, p. 214
6.4.4 Reparametrization, p. 216
6.5 Applications, p. 217
6.5.1 Generalized linear mixed models, p. 217
6.5.2 Dynamic linear models, p. 223
6.5.3 Dynamic generalized linear models, p. 226
6.5.4 Spatial models, p. 231
6.6 Exercises, p. 234
7 Further topics in MCMC, p. 237
7.1 Introduction, p. 237
7.2 Model adequacy, p. 237
7.2.1 Estimates of the predictive likelihood, p. 238
7.2.2 Uses of the predictive likelihood, p. 248
7.2.3 Deviance information criterion, p. 253
7.3 Model choice: MCMC over model and parameter spaces, p. 257
7.3.1 Markov chain for supermodels, p. 258
7.3.2 Markov chain with jumps, p. 261
7.3.3 Further issues related to RJMCMC algorithms, p. 270
7.4 Convergence acceleration, p. 271
7.4.1 Alterations to the chain, p. 271
7.4.2 Alterations to the equilibrium distribution, p. 278
7.4.3 Auxiliary variables, p. 282
7.5 Exercises, p. 284
References, p. 289
Author index, p. 311
Subject index, p. 316