Title:
Bayesian methods for nonlinear classification and regression
Publication Information:
Chichester, West Sussex : John Wiley & Sons, 2002
ISBN:
9780471490364
Available:
Item Barcode: 30000004301143
Call Number: QA279.5 B394 2002
Material Type: Open Access Book
Item Category 1: Book

On Order
Summary

Nonlinear Bayesian modelling is a relatively young field, but one that has seen an explosion of interest in recent years. Nonlinear models offer more flexibility than those with linear assumptions, and their implementation has become much easier thanks to increases in computational power. Bayesian methods allow prior information to be incorporated into the analysis, enabling coherent inference. Bayesian Methods for Nonlinear Classification and Regression is the first book to bring together, in a consistent statistical framework, the ideas of nonlinear modelling and Bayesian methods.
* Focuses on the problems of classification and regression using flexible, data-driven approaches.
* Demonstrates how Bayesian ideas can be used to improve existing statistical methods.
* Includes coverage of Bayesian additive models, decision trees, nearest-neighbour, wavelets, regression splines, and neural networks.
* Emphasis is placed on sound implementation of nonlinear models.
* Discusses medical, spatial, and economic applications.
* Includes problems at the end of most of the chapters.
* Supported by a web site featuring implementation code and data sets.
Primarily of interest to researchers in nonlinear statistical modelling, the book is also suitable for graduate students of statistics, and will benefit researchers involved in regression and classification modelling in electrical engineering, economics, machine learning and computer science.
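The summary's point about incorporating prior information can be made concrete with the conjugate normal linear model, the topic of the book's Section 2.4. The sketch below is a minimal NumPy illustration, not the book's code: the noise variance `sigma2`, the prior variance `tau2`, and the `posterior` helper are all hypothetical choices made for the example.

```python
import numpy as np

# Conjugate Bayesian linear model: y = X @ beta + eps, eps ~ N(0, sigma2 * I),
# with an isotropic Gaussian prior beta ~ N(0, tau2 * I).  Gaussian prior plus
# Gaussian likelihood gives a Gaussian posterior in closed form.
def posterior(X, y, sigma2=1.0, tau2=10.0):
    p = X.shape[1]
    # Posterior precision = data precision + prior precision.
    prec = X.T @ X / sigma2 + np.eye(p) / tau2
    cov = np.linalg.inv(prec)
    # Posterior mean shrinks the least-squares estimate towards the prior mean (zero).
    mean = cov @ (X.T @ y) / sigma2
    return mean, cov

# Simulate data from a known coefficient vector and recover it.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_beta = np.array([1.5, -0.5])
y = X @ true_beta + rng.normal(scale=1.0, size=100)
mean, cov = posterior(X, y)
```

Shrinking `tau2` towards zero pulls the posterior mean towards the prior mean of zero, while letting `tau2` grow recovers the ordinary least-squares estimate; this trade-off is the simplest instance of the prior's role in the coherent inference the summary describes.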


Author Notes

David G. T. Denison, Christopher C. Holmes, Bani K. Mallick and Adrian F. M. Smith are the authors of Bayesian Methods for Nonlinear Classification and Regression, published by Wiley.


Table of Contents

Preface p. xi
Acknowledgements p. xiii
1 Introduction p. 1
1.1 Regression and Classification p. 1
1.2 Bayesian Nonlinear Methods p. 4
1.2.1 Approximating functions p. 4
1.2.2 The 'best' model p. 4
1.2.3 Bayesian methods p. 5
1.3 Outline of the Book p. 5
2 Bayesian Modelling p. 9
2.1 Introduction p. 9
2.2 Data Modelling p. 9
2.2.1 The representation theorem for classification p. 9
2.2.2 The general representation theorem p. 10
2.2.3 Bayes' Theorem p. 11
2.2.4 Modelling with predictors p. 12
2.3 Basics of Regression Modelling p. 14
2.3.1 The regression problem p. 14
2.3.2 Basis function models for the regression function p. 14
2.4 The Bayesian Linear Model p. 15
2.4.1 The priors p. 16
2.4.2 The likelihood p. 17
2.4.3 The posterior p. 17
2.5 Model Comparison p. 18
2.5.1 Bayes factors p. 19
2.5.2 Occam's razor p. 20
2.5.3 Lindley's paradox p. 22
2.6 Model Selection p. 24
2.6.1 Searching for models p. 25
2.7 Model Averaging p. 28
2.7.1 Predictive inference p. 28
2.7.2 Problems with model selection p. 30
2.7.3 Other work on model averaging p. 31
2.8 Posterior Sampling p. 31
2.8.1 The Gibbs sampler p. 33
2.8.2 The Metropolis-Hastings algorithm p. 34
2.8.3 The reversible jump algorithm p. 36
2.8.4 Hybrid sampling p. 39
2.8.5 Convergence p. 40
2.9 Further Reading p. 41
2.10 Problems p. 42
3 Curve Fitting p. 45
3.1 Introduction p. 45
3.2 Curve Fitting Using Step Functions p. 46
3.2.1 Example: Nile discharge data p. 46
3.3 Curve Fitting with Splines p. 51
3.3.1 Metropolis-Hastings sampler p. 53
3.3.2 Gibbs sampling p. 56
3.3.3 Example: Great Barrier Reef data p. 57
3.3.4 Monitoring convergence of the sampler p. 60
3.3.5 Default curve fitting p. 63
3.4 Curve Fitting Using Wavelets p. 66
3.4.1 Wavelet shrinkage p. 69
3.4.2 Bayesian wavelets p. 70
3.5 Prior Elicitation p. 72
3.5.1 The model prior p. 73
3.5.2 Prior on the model parameters p. 78
3.5.3 The prior on the coefficients p. 79
3.5.4 The prior on the regression variance p. 82
3.6 Robust Curve Fitting p. 82
3.6.1 Modelling with a heavy-tailed error distribution p. 83
3.6.2 Outlier detection models p. 86
3.7 Discussion p. 88
3.8 Further Reading p. 89
3.9 Problems p. 91
4 Surface Fitting p. 95
4.1 Introduction p. 95
4.2 Additive Models p. 95
4.2.1 Introduction to additive modelling p. 95
4.2.2 Ozone data example p. 98
4.2.3 Further reading on Bayesian additive models p. 99
4.3 Higher-Order Splines p. 100
4.3.1 Truncated linear splines p. 100
4.4 High-Dimensional Regression p. 102
4.4.1 Extending to higher dimensions p. 102
4.4.2 The BWISE model p. 103
4.4.3 The BMARS model p. 103
4.4.4 Piecewise linear models p. 110
4.4.5 Neural network models p. 115
4.5 Time Series Analysis p. 119
4.5.1 The BAYSTAR model p. 121
4.5.2 Example: Wolf's sunspots data p. 122
4.5.3 Chaotic time series p. 124
4.6 Further Reading p. 126
4.7 Problems p. 126
5 Classification Using Generalised Nonlinear Models p. 129
5.1 Introduction p. 129
5.2 Nonlinear Models for Classification p. 130
5.2.1 Classification p. 130
5.2.2 Auxiliary variables method for classification p. 132
5.3 Bayesian MARS for Classification p. 136
5.3.1 Multiclass classification p. 137
5.4 Count Data p. 138
5.4.1 Example: Rongelap Island dataset p. 140
5.5 The Generalised Linear Model Framework p. 141
5.5.1 Bayesian generalised linear models p. 144
5.5.2 Log-concavity p. 144
5.6 Further Reading p. 145
5.7 Problems p. 146
6 Bayesian Tree Models p. 149
6.1 Introduction p. 149
6.1.1 Motivation for trees p. 150
6.1.2 Binary-tree structure p. 150
6.2 Bayesian Trees p. 152
6.2.1 The random tree structure p. 152
6.2.2 Classification trees p. 153
6.2.3 Regression trees p. 155
6.2.4 Prior on trees p. 156
6.3 Simple Trees p. 158
6.3.1 Stumps p. 159
6.3.2 A Bayesian splitting criterion p. 160
6.4 Searching for Large Trees p. 161
6.4.1 The sampling algorithm p. 161
6.4.2 Problems with sampling p. 164
6.4.3 Improving the generated 'sample' p. 165
6.5 Classification Using Bayesian Trees p. 166
6.5.1 The Pima Indian dataset p. 166
6.5.2 Selecting trees from the sample p. 167
6.5.3 Summarising the output p. 167
6.5.4 Identifying good trees p. 169
6.6 Discussion p. 170
6.7 Further Reading p. 174
6.8 Problems p. 175
7 Partition Models p. 177
7.1 Introduction p. 177
7.2 One-Dimensional Partition Models p. 179
7.2.1 Changepoint models p. 182
7.3 Multidimensional Partition Models p. 184
7.3.1 Tessellations p. 184
7.3.2 Marginal likelihoods for partition models p. 186
7.3.3 Prior on the model structure p. 187
7.3.4 Computational strategy p. 188
7.4 Classification with Partition Models p. 188
7.4.1 Speech recognition dataset p. 188
7.5 Disease Mapping with Partition Models p. 191
7.5.1 Introduction p. 191
7.5.2 The disease mapping problem p. 192
7.5.3 The binomial model for disease risk p. 192
7.5.4 The Poisson model for disease risk p. 193
7.5.5 Example: leukaemia incidence data p. 193
7.5.6 Convergence assessment p. 195
7.5.7 Posterior inference for the leukaemia data p. 197
7.6 Discussion p. 199
7.7 Further Reading p. 203
7.8 Problems p. 206
8 Nearest-Neighbour Models p. 209
8.1 Introduction p. 209
8.2 Nearest-Neighbour Classification p. 209
8.3 Probabilistic Nearest Neighbour p. 211
8.3.1 Formulation p. 211
8.3.2 Implementation p. 213
8.4 Examples p. 214
8.4.1 Ripley's simulated data p. 214
8.4.2 Arm tremor data p. 216
8.4.3 Lancing Woods data p. 217
8.5 Discussion p. 219
8.6 Further Reading p. 220
9 Multiple Response Models p. 221
9.1 Introduction p. 221
9.2 The Multiple Response Model p. 221
9.3 Conjugate Multivariate Linear Regression p. 222
9.4 Seemingly Unrelated Regressions p. 223
9.4.1 Prior on the basis function matrix p. 226
9.5 Computational Details p. 227
9.5.1 Updating the parameter vector θ p. 227
9.6 Examples p. 228
9.6.1 Vector autoregressive processes p. 229
9.6.2 Multiple curve fitting p. 230
9.7 Discussion p. 234
Appendix A Probability Distributions p. 237
Appendix B Inferential Processes p. 239
B.1 The Linear Model p. 240
B.2 Multivariate Linear Model p. 241
B.3 Exponential-Gamma Model p. 242
B.4 The Multinomial-Dirichlet Model p. 243
B.5 Poisson-Gamma Model p. 244
B.6 Uniform-Pareto Model p. 245
References p. 247
Index p. 265
Author Index p. 271