Title:
An introduction to bootstrap methods with applications to R
Personal Author:
Chernick, Michael R.; LaBudde, Robert A.
Publication Information:
Hoboken, N.J. : Wiley, c2011
Physical Description:
xvii, 216 p. : ill. ; 25 cm.
ISBN:
9780470467046
Abstract:
"This book provides both an elementary and a modern introduction to the bootstrap for students who do not have an extensive background in advanced mathematics. It offers reliable, hands-on coverage of the bootstrap's considerable advantages -- as well as its drawbacks. The book outpaces the competition by skillfully presenting results on improved confidence set estimation, estimation of error rates in discriminant analysis, and applications to a wide variety of hypothesis testing and estimation problems. To alert readers to the limitations of the method, the book exhibits counterexamples to the consistency of bootstrap methods. The authors take great care to draw connections between the more traditional resampling methods and the bootstrap, oftentimes displaying helpful computer routines in R. Emphasis throughout the book is on the use of the bootstrap as an exploratory tool including its value in variable selection and other modeling environments"-- Provided by publisher.

Available:

Item Barcode: 30000010297771
Call Number: QA276.8 C444 2011
Material Type: Open Access Book
Item Category 1: Book


Summary

A comprehensive introduction to bootstrap methods in the R programming environment

Bootstrap methods provide a powerful approach to statistical data analysis, as they apply more generally than standard parametric methods. An Introduction to Bootstrap Methods with Applications to R explores the practicality of this approach and uses R to illustrate applications of the bootstrap and other resampling methods. The book provides a modern introduction to bootstrap methods for readers who do not have an extensive background in advanced mathematics. Emphasis throughout is on the use of bootstrap methods as an exploratory tool, including their value in variable selection and other modeling environments.
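The resampling idea the summary describes can be sketched in a few lines. The book's own examples are in R; the Python sketch below (the function name, toy data, and default settings are illustrative, not taken from the book) shows the basic percentile bootstrap: resample the data with replacement many times, recompute the statistic on each resample, and read a confidence interval off the quantiles of those replicates.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic.

    Draws n_boot resamples (with replacement, same size as the data),
    recomputes `stat` on each, and returns the alpha/2 and 1 - alpha/2
    quantiles of the bootstrap replicates.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    n = len(data)
    reps = sorted(
        stat([rng.choice(data) for _ in range(n)])  # one resample's statistic
        for _ in range(n_boot)
    )
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Toy data, purely illustrative
sample = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9, 2.2, 3.6]
low, high = bootstrap_ci(sample)
print(low, high)
```

Nothing about the sketch is specific to the mean: passing `statistics.median` (or any function of a sample) as `stat` bootstraps that statistic instead, which is the generality the summary refers to.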

The authors begin with a description of the bootstrap and its relationship to other resampling methods, along with an overview of the wide variety of applications of the approach. Subsequent chapters cover improved confidence set estimation, estimation of error rates in discriminant analysis, and a wide variety of hypothesis testing and estimation problems, including applications in the pharmaceutical industry, genomics, and economics. To inform readers of the limitations of the method, the book also exhibits counterexamples to the consistency of bootstrap methods.

An introduction to R programming provides the needed preparation to work with the numerous exercises and applications presented throughout the book. A related website houses the book's R subroutines, and an extensive listing of references provides resources for further study.

Discussing the topic at a remarkably practical and accessible level, An Introduction to Bootstrap Methods with Applications to R is an excellent book for introductory courses on bootstrap and resampling methods at the upper-undergraduate and graduate levels. It also serves as an insightful reference for practitioners working with data in engineering, medicine, and the social sciences who would like to acquire a basic understanding of bootstrap methods.


Author Notes

Michael R. Chernick, PhD, is Manager of Biostatistical Services at Lankenau Institute for Medical Research, where he conducts statistical design and analysis for pharmaceutical research. He has more than thirty years of experience in the application of statistical methods to such areas as medicine, energy, engineering, insurance, and pharmaceuticals. Dr. Chernick is the author of Bootstrap Methods: A Guide for Practitioners and Researchers, Second Edition and The Essentials of Biostatistics for Physicians, Nurses, and Clinicians, and the coauthor of Introductory Biostatistics for the Health Sciences: Modern Applications Including Bootstrap, all published by Wiley.
Robert A. LaBudde, PhD, is President of Least Cost Formulations, Ltd., a mathematical software development company that specializes in optimization and process control software for manufacturing companies. He has extensive experience in industry and academia and currently serves as Adjunct Associate Professor in the Department of Mathematics and Statistics at Old Dominion University.


Table of Contents

Preface p. xi
Acknowledgments p. xv
List of Tables p. xvii
1 Introduction p. 1
1.1 Historical Background p. 1
1.2 Definition and Relationship to the Delta Method and Other Resampling Methods p. 3
1.2.1 Jackknife p. 6
1.2.2 Delta Method p. 7
1.2.3 Cross-Validation p. 7
1.2.4 Subsampling p. 8
1.3 Wide Range of Applications p. 8
1.4 The Bootstrap and the R Language System p. 10
1.5 Historical Notes p. 25
1.6 Exercises p. 26
References p. 27
2 Estimation p. 30
2.1 Estimating Bias p. 30
2.1.1 Bootstrap Adjustment p. 30
2.1.2 Error Rate Estimation in Discriminant Analysis p. 32
2.1.3 Simple Example of Linear Discrimination and Bootstrap Error Rate Estimation p. 42
2.1.4 Patch Data Example p. 51
2.2 Estimating Location p. 53
2.2.1 Estimating a Mean p. 53
2.2.2 Estimating a Median p. 54
2.3 Estimating Dispersion p. 54
2.3.1 Estimating an Estimate's Standard Error p. 55
2.3.2 Estimating Interquartile Range p. 56
2.4 Linear Regression p. 56
2.4.1 Overview p. 56
2.4.2 Bootstrapping Residuals p. 57
2.4.3 Bootstrapping Pairs (Response and Predictor Vector) p. 58
2.4.4 Heteroscedasticity of Variance: The Wild Bootstrap p. 58
2.4.5 A Special Class of Linear Regression Models: Multivariable Fractional Polynomials p. 60
2.5 Nonlinear Regression p. 60
2.5.1 Examples of Nonlinear Models p. 61
2.5.2 A Quasi-Optical Experiment p. 63
2.6 Nonparametric Regression p. 63
2.6.1 Examples of Nonparametric Regression Models p. 64
2.6.2 Bootstrap Bagging p. 66
2.7 Historical Notes p. 67
2.8 Exercises p. 69
References p. 71
3 Confidence Intervals p. 76
3.1 Subsampling, Typical Value Theorem, and Efron's Percentile Method p. 77
3.2 Bootstrap-t p. 79
3.3 Iterated Bootstrap p. 83
3.4 Bias-Corrected (BC) Bootstrap p. 85
3.5 BCa and ABC p. 85
3.6 Tilted Bootstrap p. 88
3.7 Variance Estimation with Small Sample Sizes p. 90
3.8 Historical Notes p. 94
3.9 Exercises p. 96
References p. 98
4 Hypothesis Testing p. 101
4.1 Relationship to Confidence Intervals p. 103
4.2 Why Test Hypotheses Differently? p. 105
4.3 Tendril DX Example p. 106
4.4 Klingenberg Example: Binary Dose-Response p. 108
4.5 Historical Notes p. 109
4.6 Exercises p. 110
References p. 111
5 Time Series p. 113
5.1 Forecasting Methods p. 113
5.2 Time Domain Models p. 114
5.3 Can Bootstrapping Improve Prediction Intervals? p. 115
5.4 Model-Based Methods p. 118
5.4.1 Bootstrapping Stationary Autoregressive Processes p. 118
5.4.2 Bootstrapping Explosive Autoregressive Processes p. 123
5.4.3 Bootstrapping Unstable Autoregressive Processes p. 123
5.4.4 Bootstrapping Stationary ARMA Processes p. 123
5.5 Block Bootstrapping for Stationary Time Series p. 123
5.6 Dependent Wild Bootstrap (DWB) p. 126
5.7 Frequency-Based Approaches for Stationary Time Series p. 127
5.8 Sieve Bootstrap p. 128
5.9 Historical Notes p. 129
5.10 Exercises p. 131
References p. 131
6 Bootstrap Variants p. 136
6.1 Bayesian Bootstrap p. 137
6.2 Smoothed Bootstrap p. 138
6.3 Parametric Bootstrap p. 139
6.4 Double Bootstrap p. 139
6.5 The m-Out-of-n Bootstrap p. 140
6.6 The Wild Bootstrap p. 141
6.7 Historical Notes p. 141
6.8 Exercises p. 142
References p. 142
7 Special Topics p. 144
7.1 Spatial Data p. 144
7.1.1 Kriging p. 144
7.1.2 Asymptotics for Spatial Data p. 147
7.1.3 Block Bootstrap on Regular Grids p. 148
7.1.4 Block Bootstrap on Irregular Grids p. 148
7.2 Subset Selection in Regression p. 148
7.2.1 Gong's Logistic Regression Example p. 149
7.2.2 Gunter's Qualitative Interaction Example p. 153
7.3 Determining the Number of Distributions in a Mixture p. 155
7.4 Censored Data p. 157
7.5 P-Value Adjustment p. 158
7.5.1 The Westfall-Young Approach p. 159
7.5.2 Passive Plus Example p. 159
7.5.3 Consulting Example p. 160
7.6 Bioequivalence p. 162
7.6.1 Individual Bioequivalence p. 162
7.6.2 Population Bioequivalence p. 165
7.7 Process Capability Indices p. 165
7.8 Missing Data p. 172
7.9 Point Processes p. 174
7.10 Bootstrap to Detect Outliers p. 176
7.11 Lattice Variables p. 177
7.12 Covariate Adjustment of Area Under the Curve Estimates for Receiver Operating Characteristic (ROC) Curves p. 177
7.13 Bootstrapping in SAS p. 179
7.14 Historical Notes p. 182
7.15 Exercises p. 183
References p. 185
8 When the Bootstrap is Inconsistent and How to Remedy It p. 190
8.1 Too Small of a Sample Size p. 191
8.2 Distributions with Infinite Second Moments p. 191
8.2.1 Introduction p. 191
8.2.2 Example of Inconsistency p. 192
8.2.3 Remedies p. 193
8.3 Estimating Extreme Values p. 194
8.3.1 Introduction p. 194
8.3.2 Example of Inconsistency p. 194
8.3.3 Remedies p. 194
8.4 Survey Sampling p. 195
8.4.1 Introduction p. 195
8.4.2 Example of Inconsistency p. 195
8.4.3 Remedies p. 195
8.5 m-Dependent Sequences p. 196
8.5.1 Introduction p. 196
8.5.2 Example of Inconsistency When Independence Is Assumed p. 196
8.5.3 Remedy p. 197
8.6 Unstable Autoregressive Processes p. 197
8.6.1 Introduction p. 197
8.6.2 Example of Inconsistency p. 197
8.6.3 Remedies p. 197
8.7 Long-Range Dependence p. 198
8.7.1 Introduction p. 198
8.7.2 Example of Inconsistency p. 198
8.7.3 A Remedy p. 198
8.8 Bootstrap Diagnostics p. 199
8.9 Historical Notes p. 199
8.10 Exercises p. 201
References p. 201
Author Index p. 204
Subject Index p. 210