Title:
Probability and statistical inference
Personal Author:
Mukhopadhyay, Nitis
Series:
Statistics, textbooks and monographs ; v. 162
Publication Information:
London : CRC, c2000
Physical Description:
xviii, 665 p. : ill. ; 24 cm.
ISBN:
9780824703790

Available:

Item Barcode     Call Number       Material Type      Item Category 1
30000010316651   QA273 M854 2000   Open Access Book   Book
Summary

Priced very competitively compared with other textbooks at this level!
This gracefully organized textbook reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, numerous figures and tables, and computer simulations to develop and illustrate concepts.

Beginning with an introduction to the basic ideas and techniques in probability theory and progressing to more rigorous topics, Probability and Statistical Inference
studies the Helmert transformation for normal distributions and the waiting time between failures for exponential distributions
develops notions of convergence in probability and distribution
spotlights the central limit theorem (CLT) for the sample variance
introduces sampling distributions and the Cornish-Fisher expansions
concentrates on the fundamentals of sufficiency, information, completeness, and ancillarity
explains Basu's Theorem as well as location, scale, and location-scale families of distributions
covers moment estimators, maximum likelihood estimators (MLE), Rao-Blackwellization, and the Cramér-Rao inequality (stated for reference after this list)
discusses uniformly minimum variance unbiased estimators (UMVUE) and Lehmann-Scheffé Theorems
focuses on the Neyman-Pearson theory of most powerful (MP) and uniformly most powerful (UMP) tests of hypotheses, as well as confidence intervals
includes the likelihood ratio (LR) tests for the mean, variance, and correlation coefficient
summarizes Bayesian methods
describes the monotone likelihood ratio (MLR) property
handles variance stabilizing transformations (Fisher's z is sketched after this list)
provides a historical context for statistics and statistical discoveries
showcases great statisticians through biographical notes
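
As a pointer to the level of the material, the Cramér-Rao inequality named above (Chapter 7) has the standard form below; the notation is the conventional one and is an illustration, not a quotation from the book. For an unbiased estimator T = T(X_1, ..., X_n) of θ based on n i.i.d. observations with density f(x; θ), under the usual regularity conditions,

\[
\operatorname{Var}_{\theta}(T) \;\ge\; \frac{1}{n\,I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial\theta}\,\log f(X_1;\theta)\right)^{2}\right],
\]

where I(θ) is the Fisher information in a single observation.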
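
Likewise, the variance stabilizing transformations of Chapter 12 include Fisher's z for the sample correlation coefficient r; again in conventional notation (an illustration, not a quotation from the text),

\[
z \;=\; \tfrac{1}{2}\log\frac{1+r}{1-r} \;=\; \tanh^{-1}(r),
\qquad
\sqrt{n}\,\bigl(z - \tanh^{-1}(\rho)\bigr) \xrightarrow{\,d\,} N(0,1),
\]

so the limiting variance no longer depends on the unknown ρ.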

Employing over 1400 equations to reinforce its subject matter, Probability and Statistical Inference is a groundbreaking text for students in first-year graduate and upper-level undergraduate courses in probability and statistical inference who have completed a calculus prerequisite, as well as a supplemental text for classes in Advanced Statistical Inference or Decision Theory.


Table of Contents

Preface p. v
Acknowledgments p. xi
1 Notions of Probability p. 1
1.1 Introduction p. 1
1.2 About Sets p. 3
1.3 Axiomatic Development of Probability p. 6
1.4 The Conditional Probability and Independent Events p. 9
1.4.1 Calculus of Probability p. 12
1.4.2 Bayes's Theorem p. 14
1.4.3 Selected Counting Rules p. 16
1.5 Discrete Random Variables p. 18
1.5.1 Probability Mass and Distribution Functions p. 19
1.6 Continuous Random Variables p. 23
1.6.1 Probability Density and Distribution Functions p. 23
1.6.2 The Median of a Distribution p. 28
1.6.3 Selected Reviews from Mathematics p. 28
1.7 Some Standard Probability Distributions p. 32
1.7.1 Discrete Distributions p. 33
1.7.2 Continuous Distributions p. 37
1.8 Exercises and Complements p. 50
2 Expectations of Functions of Random Variables p. 65
2.1 Introduction p. 65
2.2 Expectation and Variance p. 65
2.2.1 The Bernoulli Distribution p. 71
2.2.2 The Binomial Distribution p. 72
2.2.3 The Poisson Distribution p. 73
2.2.4 The Uniform Distribution p. 73
2.2.5 The Normal Distribution p. 73
2.2.6 The Laplace Distribution p. 76
2.2.7 The Gamma Distribution p. 76
2.3 The Moments and Moment Generating Function p. 77
2.3.1 The Binomial Distribution p. 80
2.3.2 The Poisson Distribution p. 81
2.3.3 The Normal Distribution p. 82
2.3.4 The Gamma Distribution p. 84
2.4 Determination of a Distribution via MGF p. 86
2.5 The Probability Generating Function p. 88
2.6 Exercises and Complements p. 89
3 Multivariate Random Variables p. 99
3.1 Introduction p. 99
3.2 Discrete Distributions p. 100
3.2.1 The Joint, Marginal and Conditional Distributions p. 101
3.2.2 The Multinomial Distribution p. 103
3.3 Continuous Distributions p. 107
3.3.1 The Joint, Marginal and Conditional Distributions p. 107
3.3.2 Three and Higher Dimensions p. 115
3.4 Covariances and Correlation Coefficients p. 119
3.4.1 The Multinomial Case p. 124
3.5 Independence of Random Variables p. 125
3.6 The Bivariate Normal Distribution p. 131
3.7 Correlation Coefficient and Independence p. 139
3.8 The Exponential Family of Distributions p. 141
3.8.1 One-parameter Situation p. 141
3.8.2 Multi-parameter Situation p. 144
3.9 Some Standard Probability Inequalities p. 145
3.9.1 Markov and Bernstein-Chernoff Inequalities p. 145
3.9.2 Tchebysheff's Inequality p. 148
3.9.3 Cauchy-Schwarz and Covariance Inequalities p. 149
3.9.4 Jensen's and Lyapunov's Inequalities p. 152
3.9.5 Hölder's Inequality p. 156
3.9.6 Bonferroni Inequality p. 157
3.9.7 Central Absolute Moment Inequality p. 158
3.10 Exercises and Complements p. 159
4 Functions of Random Variables and Sampling Distribution p. 177
4.1 Introduction p. 177
4.2 Using Distribution Functions p. 179
4.2.1 Discrete Cases p. 179
4.2.2 Continuous Cases p. 181
4.2.3 The Order Statistics p. 182
4.2.4 The Convolution p. 185
4.2.5 The Sampling Distribution p. 187
4.3 Using the Moment Generating Function p. 190
4.4 A General Approach with Transformations p. 192
4.4.1 Several Variable Situations p. 195
4.5 Special Sampling Distributions p. 206
4.5.1 The Student's t Distribution p. 207
4.5.2 The F Distribution p. 209
4.5.3 The Beta Distribution p. 211
4.6 Special Continuous Multivariate Distributions p. 212
4.6.1 The Normal Distribution p. 212
4.6.2 The t Distribution p. 218
4.6.3 The F Distribution p. 219
4.7 Importance of Independence in Sampling Distributions p. 220
4.7.1 Reproductivity of Normal Distributions p. 220
4.7.2 Reproductivity of Chi-square Distributions p. 221
4.7.3 The Student's t Distribution p. 223
4.7.4 The F Distribution p. 223
4.8 Selected Review in Matrices and Vectors p. 224
4.9 Exercises and Complements p. 227
5 Concepts of Stochastic Convergence p. 241
5.1 Introduction p. 241
5.2 Convergence in Probability p. 242
5.3 Convergence in Distribution p. 253
5.3.1 Combination of the Modes of Convergence p. 256
5.3.2 The Central Limit Theorems p. 257
5.4 Convergence of Chi-square, t, and F Distributions p. 264
5.4.1 The Chi-square Distribution p. 264
5.4.2 The Student's t Distribution p. 264
5.4.3 The F Distribution p. 265
5.4.4 Convergence of the PDF and Percentage Points p. 265
5.5 Exercises and Complements p. 270
6 Sufficiency, Completeness, and Ancillarity p. 281
6.1 Introduction p. 281
6.2 Sufficiency p. 282
6.2.1 The Conditional Distribution Approach p. 284
6.2.2 The Neyman Factorization Theorem p. 288
6.3 Minimal Sufficiency p. 294
6.3.1 The Lehmann-Scheffé Approach p. 295
6.4 Information p. 300
6.4.1 One-parameter Situation p. 301
6.4.2 Multi-parameter Situation p. 304
6.5 Ancillarity p. 309
6.5.1 The Location, Scale, and Location-Scale Families p. 314
6.5.2 Its Role in the Recovery of Information p. 316
6.6 Completeness p. 318
6.6.1 Complete Sufficient Statistics p. 320
6.6.2 Basu's Theorem p. 324
6.7 Exercises and Complements p. 327
7 Point Estimation p. 341
7.1 Introduction p. 341
7.2 Finding Estimators p. 342
7.2.1 The Method of Moments p. 342
7.2.2 The Method of Maximum Likelihood p. 344
7.3 Criteria to Compare Estimators p. 351
7.3.1 Unbiasedness, Variance and Mean Squared Error p. 351
7.3.2 Best Unbiased and Linear Unbiased Estimators p. 354
7.4 Improved Unbiased Estimator via Sufficiency p. 358
7.4.1 The Rao-Blackwell Theorem p. 358
7.5 Uniformly Minimum Variance Unbiased Estimator p. 365
7.5.1 The Cramér-Rao Inequality and UMVUE p. 366
7.5.2 The Lehmann-Scheffé Theorems and UMVUE p. 371
7.5.3 A Generalization of the Cramér-Rao Inequality p. 374
7.5.4 Evaluation of Conditional Expectations p. 375
7.6 Unbiased Estimation Under Incompleteness p. 377
7.6.1 Does the Rao-Blackwell Theorem Lead to UMVUE? p. 377
7.7 Consistent Estimators p. 380
7.8 Exercises and Complements p. 382
8 Tests of Hypotheses p. 395
8.1 Introduction p. 395
8.2 Error Probabilities and the Power Function p. 396
8.2.1 The Concept of a Best Test p. 399
8.3 Simple Null Versus Simple Alternative Hypotheses p. 401
8.3.1 Most Powerful Test via the Neyman-Pearson Lemma p. 401
8.3.2 Applications: No Parameters Are Involved p. 413
8.3.3 Applications: Observations Are Non-IID p. 416
8.4 One-Sided Composite Alternative Hypothesis p. 417
8.4.1 UMP Test via the Neyman-Pearson Lemma p. 417
8.4.2 Monotone Likelihood Ratio Property p. 420
8.4.3 UMP Test via MLR Property p. 422
8.5 Simple Null Versus Two-Sided Alternative Hypotheses p. 425
8.5.1 An Example Where UMP Test Does Not Exist p. 425
8.5.2 An Example Where UMP Test Exists p. 426
8.5.3 Unbiased and UMP Unbiased Tests p. 428
8.6 Exercises and Complements p. 429
9 Confidence Interval Estimation p. 441
9.1 Introduction p. 441
9.2 One-Sample Problems p. 443
9.2.1 Inversion of a Test Procedure p. 444
9.2.2 The Pivotal Approach p. 446
9.2.3 The Interpretation of a Confidence Coefficient p. 451
9.2.4 Ideas of Accuracy Measures p. 452
9.2.5 Using Confidence Intervals in the Tests of Hypothesis p. 455
9.3 Two-Sample Problems p. 456
9.3.1 Comparing the Location Parameters p. 456
9.3.2 Comparing the Scale Parameters p. 460
9.4 Multiple Comparisons p. 463
9.4.1 Estimating a Multivariate Normal Mean Vector p. 463
9.4.2 Comparing the Means p. 465
9.4.3 Comparing the Variances p. 467
9.5 Exercises and Complements p. 469
10 Bayesian Methods p. 477
10.1 Introduction p. 477
10.2 Prior and Posterior Distributions p. 479
10.3 The Conjugate Priors p. 481
10.4 Point Estimation p. 485
10.5 Credible Intervals p. 488
10.5.1 Highest Posterior Density p. 489
10.5.2 Contrasting with the Confidence Intervals p. 492
10.6 Tests of Hypotheses p. 493
10.7 Examples with Non-Conjugate Priors p. 494
10.8 Exercises and Complements p. 497
11 Likelihood Ratio and Other Tests p. 507
11.1 Introduction p. 507
11.2 One-Sample Problems p. 508
11.2.1 LR Test for the Mean p. 509
11.2.2 LR Test for the Variance p. 512
11.3 Two-Sample Problems p. 515
11.3.1 Comparing the Means p. 515
11.3.2 Comparing the Variances p. 519
11.4 Bivariate Normal Observations p. 522
11.4.1 Comparing the Means: The Paired Difference t Method p. 522
11.4.2 LR Test for the Correlation Coefficient p. 525
11.4.3 Tests for the Variances p. 528
11.5 Exercises and Complements p. 529
12 Large-Sample Inference p. 539
12.1 Introduction p. 539
12.2 The Maximum Likelihood Estimation p. 539
12.3 Confidence Intervals and Tests of Hypothesis p. 542
12.3.1 The Distribution-Free Population Mean p. 543
12.3.2 The Binomial Proportion p. 548
12.3.3 The Poisson Mean p. 553
12.4 The Variance Stabilizing Transformations p. 555
12.4.1 The Binomial Proportion p. 556
12.4.2 The Poisson Mean p. 559
12.4.3 The Correlation Coefficient p. 560
12.5 Exercises and Complements p. 563
13 Sample Size Determination: Two-Stage Procedures p. 569
13.1 Introduction p. 569
13.2 The Fixed-Width Confidence Interval p. 573
13.2.1 Stein's Sampling Methodology p. 573
13.2.2 Some Interesting Properties p. 574
13.3 The Bounded Risk Point Estimation p. 579
13.3.1 The Sampling Methodology p. 581
13.3.2 Some Interesting Properties p. 582
13.4 Exercises and Complements p. 584
14 Appendix p. 591
14.1 Abbreviations and Notation p. 591
14.2 A Celebration of Statistics: Selected Biographical Notes p. 593
14.3 Selected Statistical Tables p. 621
14.3.1 The Standard Normal Distribution Function p. 621
14.3.2 Percentage Points of the Chi-Square Distribution p. 626
14.3.3 Percentage Points of the Student's t Distribution p. 628
14.3.4 Percentage Points of the F Distribution p. 630
References p. 633
Index p. 649