Item Barcode | Call Number | Material Type | Item Category | Status |
---|---|---|---|---|
30000010307164 | QA276.4 G58 2013 | Open Access Book | Book | On Order |
Summary
This new edition continues to serve as a comprehensive guide to modern and classical methods of statistical computing. The book comprises four main parts spanning the field:

- Optimization
- Integration and Simulation
- Bootstrapping
- Density Estimation and Smoothing

Within these sections, each chapter includes a comprehensive introduction and step-by-step implementation summaries to accompany the explanations of key methods. The new edition includes updated coverage of existing topics as well as new topics such as adaptive MCMC and bootstrapping for correlated data. The book website now includes comprehensive R code for the entire book. There are extensive exercises, real examples, and helpful insights about how to use the methods in practice.
Author Notes
GEOF H. GIVENS, PhD, is Associate Professor in the Department of Statistics at Colorado State University. He serves as Associate Editor for Computational Statistics and Data Analysis. His research interests include statistical problems in wildlife conservation biology including ecology, population modeling and management, and automated computer face recognition.
JENNIFER A. HOETING, PhD, is Professor in the Department of Statistics at Colorado State University. She is an award-winning teacher who co-leads large research efforts for the National Science Foundation. She has served as associate editor for the Journal of the American Statistical Association and Environmetrics. Her research interests include spatial statistics, Bayesian methods, and model selection.
Givens and Hoeting have taught graduate courses on computational statistics for nearly twenty years, and short courses to leading statisticians and scientists around the world.
Table of Contents
Preface | p. xv |
Acknowledgments | p. xix |
1 Review | p. 1 |
1.1 Mathematical notation | p. 1 |
1.2 Taylor's theorem and mathematical limit theory | p. 2 |
1.3 Statistical notation and probability distributions | p. 4 |
1.4 Likelihood inference | p. 6 |
1.5 Bayesian inference | p. 11 |
1.6 Statistical limit theory | p. 13 |
1.7 Markov chains | p. 14 |
1.8 Computing | p. 17 |
Part I Optimization | |
2 Optimization and Solving Nonlinear Equations | p. 3 |
2.1 Univariate problems | p. 5 |
2.2 Multivariate problems | p. 17 |
Problems | p. 36 |
3 Combinatorial Optimization | p. 43 |
3.1 Hard problems and NP-completeness | p. 44 |
3.2 Local search | p. 50 |
3.3 Simulated annealing | p. 53 |
3.4 Genetic algorithms | p. 60 |
3.5 Tabu algorithms | p. 71 |
Problems | p. 78 |
4 EM Optimization Methods | p. 83 |
4.1 Missing data, marginalization, and notation | p. 84 |
4.2 The EM algorithm | p. 84 |
4.3 EM variants | p. 98 |
Problems | p. 108 |
Part II Integration and Simulation | |
5 Numerical Integration | p. 117 |
5.1 Newton-Cotes quadrature | p. 118 |
5.2 Romberg integration | p. 127 |
5.3 Gaussian quadrature | p. 131 |
5.4 Frequently encountered problems | p. 135 |
Problems | p. 137 |
6 Simulation and Monte Carlo Integration | p. 139 |
6.1 Introduction to the Monte Carlo method | p. 140 |
6.3 Approximate simulation | p. 152 |
6.4 Variance reduction techniques | p. 170 |
Problems | p. 185 |
7 Markov Chain Monte Carlo | p. 191 |
7.1 Metropolis-Hastings algorithm | p. 192 |
7.2 Gibbs sampling | p. 199 |
7.3 Implementation | p. 210 |
Problems | p. 222 |
8 Advanced Topics in MCMC | p. 229 |
8.1 Adaptive MCMC | p. 229 |
8.2 Reversible Jump MCMC | p. 243 |
8.3 Auxiliary variable methods | p. 250 |
8.4 Other Metropolis-Hastings algorithms | p. 254 |
8.5 Perfect sampling | p. 258 |
8.6 Markov chain maximum likelihood | p. 262 |
8.7 Example: MCMC for Markov random fields | p. 263 |
Problems | p. 274 |
Part III Approximating Distributions | |
9 Bootstrapping | p. 281 |
9.1 The bootstrap principle | p. 281 |
9.2 Basic methods | p. 283 |
9.3 Bootstrap inference | p. 286 |
9.4 Reducing Monte Carlo error | p. 297 |
9.5 Bootstrapping dependent data | p. 298 |
9.6 Bootstrap performance | p. 310 |
9.7 Other uses of the bootstrap | p. 312 |
9.8 Permutation tests | p. 313 |
Problems | p. 314 |
Part IV Density Estimation and Smoothing | |
10 Nonparametric Density Estimation | p. 321 |
10.1 Measures of performance | p. 322 |
10.2 Kernel density estimation | p. 324 |
10.3 Nonkernel methods | p. 338 |
10.4 Multivariate methods | p. 341 |
Problems | p. 356 |
11 Bivariate Smoothing | p. 361 |
11.1 Predictor-response data | p. 362 |
11.2 Linear smoothers | p. 364 |
11.3 Comparison of linear smoothers | p. 376 |
11.4 Nonlinear smoothers | p. 378 |
11.5 Confidence bands | p. 385 |
11.6 General bivariate data | p. 388 |
Problems | p. 389 |
12 Multivariate Smoothing | p. 393 |
12.1 Predictor-response data | p. 393 |
12.2 General multivariate data | p. 415 |
Problems | p. 419 |
Data Acknowledgments | p. 423 |
References | p. 425 |
Index | p. 453 |