Title:
Statistical theory : a concise introduction
Personal Author:
Abramovich, Felix
Series:
Chapman & Hall/CRC texts in statistical science.
Publication Information:
Boca Raton : CRC Press, 2013.
Physical Description:
xv, 224 p. : ill. ; 25 cm.
ISBN:
9781439851845
Abstract:
"Designed for a one-semester advanced undergraduate or graduate course, Statistical Theory: A Concise Introduction clearly explains the underlying ideas and principles of major statistical concepts, including parameter estimation, confidence intervals, hypothesis testing, asymptotic analysis, Bayesian inference, and elements of decision theory. It introduces these topics on a clear intuitive level using illustrative examples in addition to the formal definitions, theorems, and proofs. Based on the authors' lecture notes, this student-oriented, self-contained book maintains a proper balance between the clarity and rigor of exposition. In a few cases, the authors present a 'sketched' version of a proof, explaining its main ideas rather than giving detailed technical mathematical and probabilistic arguments. Chapters and sections marked by asterisks contain more advanced topics and may be omitted. A special chapter on linear models shows how the main theoretical concepts can be applied to the well-known and frequently used statistical tool of linear regression. Requiring no heavy calculus, simple questions throughout the text help students check their understanding of the material. Each chapter also includes a set of exercises that range in level of difficulty"-- Provided by publisher.

"Preface This book is intended as a textbook for a one-term course in statistical theory for advanced undergraduates in statistics, mathematics or other related fields although at least parts of it may be useful for graduates as well. Although there exist many good books on the topic, having taught a one-term Statistical Theory course during the years we felt that it is somewhat hard to recommend a particular one as a proper textbook to undergraduate students in statistics. Some of the existing textbooks with a primary focus on rigorous formalism, in our view, do not explain sufficiently clearly the underlying ideas and principles of the main statistical concepts, and are more suitable for graduates. Some others are "all-inclusive" textbooks that include a variety of topics in statistics that make them "too heavy" for a one-term course in statistical theory. Our main motivation was to propose a more "student-oriented" self-contained textbook designed for a one-term course on statistical theory that would introduce basic statistical concepts first on a clear intuitive level with illustrative examples in addition to the (necessary!) formal definitions, theorems and proofs. It is based on our lecture notes. We tried to keep a proper balance between the clarity and rigorousness of exposition. In a few cases we preferred to present a "sketched" version of a proof explaining its main ideas or even to give it up at all rather then to follow detailed technical mathematical and probabilistic arguments. The interested reader can complete those proofs from other existing books on mathematical statistics (see the bibliography)"-- Provided by publisher.
Added Author:
Ritov, Ya'acov

Available:

Item Barcode: 30000010321445
Call Number: QA276 A23 2013 (Open Access)
Material Type: Book
Item Category 1: Book

Table of Contents

List of Figures p. xi
List of Tables p. xiii
Preface p. xv
1 Introduction p. 1
1.1 Preamble p. 1
1.2 Likelihood p. 4
1.3 Sufficiency p. 6
1.4 *Minimal sufficiency p. 9
1.5 *Completeness p. 11
1.6 Exponential family of distributions p. 12
1.7 Exercises p. 16
2 Point Estimation p. 19
2.1 Introduction p. 19
2.2 Maximum likelihood estimation p. 19
2.3 Method of moments p. 25
2.4 Method of least squares p. 26
2.5 Goodness-of-estimation. Mean squared error p. 27
2.6 Unbiased estimation p. 29
2.6.1 Definition and main properties p. 29
2.6.2 Uniformly minimum variance unbiased estimators. The Cramér-Rao lower bound p. 31
2.6.3 *The Cramér-Rao lower bound for multivariate parameters p. 35
2.6.4 Rao-Blackwell theorem p. 37
2.6.5 *Lehmann-Scheffé theorem p. 40
2.7 Exercises p. 40
3 Confidence Intervals, Bounds, and Regions p. 45
3.1 Introduction p. 45
3.2 Quoting the estimation error p. 46
3.3 Confidence intervals p. 48
3.4 Confidence bounds p. 54
3.5 *Confidence regions p. 54
3.6 Exercises p. 58
4 Hypothesis Testing p. 61
4.1 Introduction p. 61
4.2 Simple hypotheses p. 62
4.2.1 Type I and Type II errors p. 62
4.2.2 Choice of a critical value p. 65
4.2.3 The p-value p. 67
4.2.4 Maximal power tests. Neyman-Pearson lemma p. 69
4.3 Composite hypotheses p. 74
4.3.1 Power function p. 74
4.3.2 Uniformly most powerful tests p. 77
4.3.3 Generalized likelihood ratio tests p. 81
4.4 Hypothesis testing and confidence intervals p. 85
4.5 Sequential testing p. 87
4.6 Exercises p. 91
5 Asymptotic Analysis p. 95
5.1 Introduction p. 95
5.2 Convergence and consistency in MSE p. 96
5.3 Convergence and consistency in probability p. 97
5.4 Convergence in distribution p. 101
5.5 The central limit theorem p. 103
5.6 Asymptotically normal consistency p. 105
5.7 Asymptotic confidence intervals p. 108
5.8 Asymptotically normal consistency of the MLE, Wald's confidence intervals, and tests p. 112
5.9 *Multiparameter case p. 114
5.10 Asymptotic distribution of the GLRT, Wilks' theorem p. 117
5.11 Exercises p. 122
6 Bayesian Inference p. 125
6.1 Introduction p. 125
6.2 Choice of priors p. 128
6.2.1 Conjugate priors p. 128
6.2.2 Noninformative (objective) priors p. 129
6.3 Point estimation p. 133
6.4 Interval estimation. Credible sets p. 135
6.5 Hypothesis testing p. 136
6.5.1 Simple hypotheses p. 137
6.5.2 Composite hypotheses p. 138
6.5.3 Testing a point null hypothesis p. 140
6.6 Exercises p. 141
7 *Elements of Statistical Decision Theory p. 143
7.1 Introduction and notations p. 143
7.2 Risk function and admissibility p. 145
7.3 Minimax risk and minimax rules p. 146
7.4 Bayes risk and Bayes rules p. 147
7.5 Posterior expected loss and Bayes actions p. 148
7.6 Admissibility and minimaxity of Bayes rules p. 152
7.7 Exercises p. 155
8 *Linear Models p. 157
8.1 Introduction p. 157
8.2 Definition and examples p. 157
8.3 Estimation of regression coefficients p. 160
8.4 Residuals. Estimation of the variance p. 164
8.5 Examples p. 165
8.5.1 Estimation of a normal mean p. 165
8.5.2 Comparison between the means of two independent normal samples with a common variance p. 166
8.5.3 Simple linear regression p. 166
8.6 Goodness-of-fit. Multiple correlation coefficient p. 167
8.7 Confidence intervals and regions for the coefficients p. 168
8.8 Hypothesis testing in linear models p. 170
8.8.1 Testing significance of a single predictor p. 170
8.8.2 Testing significance of a group of predictors p. 170
8.8.3 Testing a general linear hypothesis p. 172
8.9 Predictions p. 174
8.10 Analysis of variance p. 175
8.10.1 One-way ANOVA p. 176
8.10.2 Two-way ANOVA and beyond p. 178
A Probabilistic Review p. 181
A.1 Introduction p. 181
A.2 Basic probabilistic laws p. 181
A.3 Random variables p. 182
A.3.1 Expected value and the variance p. 183
A.3.2 Chebyshev's and Markov's inequalities p. 185
A.3.3 Expectation of functions and the Jensen's inequality p. 185
A.3.4 Joint distribution p. 186
A.3.5 Covariance, correlation, and the Cauchy-Schwarz inequality p. 186
A.3.6 Expectation and variance of a sum of random variables p. 187
A.3.7 Conditional distribution and Bayes Theorem p. 188
A.3.8 Distributions of functions of random variables p. 188
A.3.9 Random vectors p. 189
A.4 Special families of distributions p. 191
A.4.1 Bernoulli and binomial distributions p. 191
A.4.2 Geometric and negative binomial distributions p. 192
A.4.3 Hypergeometric distribution p. 193
A.4.4 Poisson distribution p. 194
A.4.5 Uniform distribution p. 195
A.4.6 Exponential distribution p. 195
A.4.7 Weibull distribution p. 196
A.4.8 Gamma-distribution p. 197
A.4.9 Beta-distribution p. 197
A.4.10 Cauchy distribution p. 198
A.4.11 Normal distribution p. 198
A.4.12 Log-normal distribution p. 199
A.4.13 χ²-distribution p. 199
A.4.14 t-distribution p. 200
A.4.15 F-distribution p. 200
A.4.16 Multinormal distribution p. 200
A.4.16.1 Definition and main properties p. 200
A.4.16.2 Projections of normal vectors p. 202
B Solutions of Selected Exercises p. 205
B.1 Chapter 1 p. 205
B.2 Chapter 2 p. 206
B.3 Chapter 3 p. 208
B.4 Chapter 4 p. 210
B.5 Chapter 5 p. 212
B.6 Chapter 6 p. 214
B.7 Chapter 7 p. 215
Index p. 221