Library | Item Barcode | Call Number | Material Type | Item Category 1 | Status |
---|---|---|---|---|---|
 | 30000010316651 | QA273 M854 2000 | Open Access Book | Book | On Order |
Summary
Priced very competitively compared with other textbooks at this level!
This gracefully organized textbook reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, numerous figures and tables, and computer simulations to develop and illustrate concepts.
Beginning with an introduction to the basic ideas and techniques in probability theory and progressing to more rigorous topics, Probability and Statistical Inference:

- studies the Helmert transformation for normal distributions and the waiting time between failures for exponential distributions
- develops notions of convergence in probability and distribution
- spotlights the central limit theorem (CLT) for the sample variance
- introduces sampling distributions and the Cornish-Fisher expansions
- concentrates on the fundamentals of sufficiency, information, completeness, and ancillarity
- explains Basu's Theorem as well as location, scale, and location-scale families of distributions
- covers moment estimators, maximum likelihood estimators (MLE), Rao-Blackwellization, and the Cramér-Rao inequality
- discusses uniformly minimum variance unbiased estimators (UMVUE) and Lehmann-Scheffé Theorems
- focuses on the Neyman-Pearson theory of most powerful (MP) and uniformly most powerful (UMP) tests of hypotheses, as well as confidence intervals
- includes the likelihood ratio (LR) tests for the mean, variance, and correlation coefficient
- summarizes Bayesian methods
- describes the monotone likelihood ratio (MLR) property
- handles variance stabilizing transformations
- provides a historical context for statistics and statistical discoveries
- showcases great statisticians through biographical notes
Employing over 1400 equations to reinforce its subject matter, Probability and Statistical Inference is a groundbreaking text for first-year graduate and upper-level undergraduate courses in probability and statistical inference whose students have completed a calculus prerequisite, as well as a supplemental text for courses in advanced statistical inference or decision theory.
Table of Contents
Preface | p. v |
Acknowledgments | p. xi |
1 Notions of Probability | p. 1 |
1.1 Introduction | p. 1 |
1.2 About Sets | p. 3 |
1.3 Axiomatic Development of Probability | p. 6 |
1.4 The Conditional Probability and Independent Events | p. 9 |
1.4.1 Calculus of Probability | p. 12 |
1.4.2 Bayes's Theorem | p. 14 |
1.4.3 Selected Counting Rules | p. 16 |
1.5 Discrete Random Variables | p. 18 |
1.5.1 Probability Mass and Distribution Functions | p. 19 |
1.6 Continuous Random Variables | p. 23 |
1.6.1 Probability Density and Distribution Functions | p. 23 |
1.6.2 The Median of a Distribution | p. 28 |
1.6.3 Selected Reviews from Mathematics | p. 28 |
1.7 Some Standard Probability Distributions | p. 32 |
1.7.1 Discrete Distributions | p. 33 |
1.7.2 Continuous Distributions | p. 37 |
1.8 Exercises and Complements | p. 50 |
2 Expectations of Functions of Random Variables | p. 65 |
2.1 Introduction | p. 65 |
2.2 Expectation and Variance | p. 65 |
2.2.1 The Bernoulli Distribution | p. 71 |
2.2.2 The Binomial Distribution | p. 72 |
2.2.3 The Poisson Distribution | p. 73 |
2.2.4 The Uniform Distribution | p. 73 |
2.2.5 The Normal Distribution | p. 73 |
2.2.6 The Laplace Distribution | p. 76 |
2.2.7 The Gamma Distribution | p. 76 |
2.3 The Moments and Moment Generating Function | p. 77 |
2.3.1 The Binomial Distribution | p. 80 |
2.3.2 The Poisson Distribution | p. 81 |
2.3.3 The Normal Distribution | p. 82 |
2.3.4 The Gamma Distribution | p. 84 |
2.4 Determination of a Distribution via MGF | p. 86 |
2.5 The Probability Generating Function | p. 88 |
2.6 Exercises and Complements | p. 89 |
3 Multivariate Random Variables | p. 99 |
3.1 Introduction | p. 99 |
3.2 Discrete Distributions | p. 100 |
3.2.1 The Joint, Marginal and Conditional Distributions | p. 101 |
3.2.2 The Multinomial Distribution | p. 103 |
3.3 Continuous Distributions | p. 107 |
3.3.1 The Joint, Marginal and Conditional Distributions | p. 107 |
3.3.2 Three and Higher Dimensions | p. 115 |
3.4 Covariances and Correlation Coefficients | p. 119 |
3.4.1 The Multinomial Case | p. 124 |
3.5 Independence of Random Variables | p. 125 |
3.6 The Bivariate Normal Distribution | p. 131 |
3.7 Correlation Coefficient and Independence | p. 139 |
3.8 The Exponential Family of Distributions | p. 141 |
3.8.1 One-parameter Situation | p. 141 |
3.8.2 Multi-parameter Situation | p. 144 |
3.9 Some Standard Probability Inequalities | p. 145 |
3.9.1 Markov and Bernstein-Chernoff Inequalities | p. 145 |
3.9.2 Tchebysheff's Inequality | p. 148 |
3.9.3 Cauchy-Schwarz and Covariance Inequalities | p. 149 |
3.9.4 Jensen's and Lyapunov's Inequalities | p. 152 |
3.9.5 Hölder's Inequality | p. 156 |
3.9.6 Bonferroni Inequality | p. 157 |
3.9.7 Central Absolute Moment Inequality | p. 158 |
3.10 Exercises and Complements | p. 159 |
4 Functions of Random Variables and Sampling Distribution | p. 177 |
4.1 Introduction | p. 177 |
4.2 Using Distribution Functions | p. 179 |
4.2.1 Discrete Cases | p. 179 |
4.2.2 Continuous Cases | p. 181 |
4.2.3 The Order Statistics | p. 182 |
4.2.4 The Convolution | p. 185 |
4.2.5 The Sampling Distribution | p. 187 |
4.3 Using the Moment Generating Function | p. 190 |
4.4 A General Approach with Transformations | p. 192 |
4.4.1 Several Variable Situations | p. 195 |
4.5 Special Sampling Distributions | p. 206 |
4.5.1 The Student's t Distribution | p. 207 |
4.5.2 The F Distribution | p. 209 |
4.5.3 The Beta Distribution | p. 211 |
4.6 Special Continuous Multivariate Distributions | p. 212 |
4.6.1 The Normal Distribution | p. 212 |
4.6.2 The t Distribution | p. 218 |
4.6.3 The F Distribution | p. 219 |
4.7 Importance of Independence in Sampling Distributions | p. 220 |
4.7.1 Reproductivity of Normal Distributions | p. 220 |
4.7.2 Reproductivity of Chi-square Distributions | p. 221 |
4.7.3 The Student's t Distribution | p. 223 |
4.7.4 The F Distribution | p. 223 |
4.8 Selected Review in Matrices and Vectors | p. 224 |
4.9 Exercises and Complements | p. 227 |
5 Concepts of Stochastic Convergence | p. 241 |
5.1 Introduction | p. 241 |
5.2 Convergence in Probability | p. 242 |
5.3 Convergence in Distribution | p. 253 |
5.3.1 Combination of the Modes of Convergence | p. 256 |
5.3.2 The Central Limit Theorems | p. 257 |
5.4 Convergence of Chi-square, t, and F Distributions | p. 264 |
5.4.1 The Chi-square Distribution | p. 264 |
5.4.2 The Student's t Distribution | p. 264 |
5.4.3 The F Distribution | p. 265 |
5.4.4 Convergence of the PDF and Percentage Points | p. 265 |
5.5 Exercises and Complements | p. 270 |
6 Sufficiency, Completeness, and Ancillarity | p. 281 |
6.1 Introduction | p. 281 |
6.2 Sufficiency | p. 282 |
6.2.1 The Conditional Distribution Approach | p. 284 |
6.2.2 The Neyman Factorization Theorem | p. 288 |
6.3 Minimal Sufficiency | p. 294 |
6.3.1 The Lehmann-Scheffé Approach | p. 295 |
6.4 Information | p. 300 |
6.4.1 One-parameter Situation | p. 301 |
6.4.2 Multi-parameter Situation | p. 304 |
6.5 Ancillarity | p. 309 |
6.5.1 The Location, Scale, and Location-Scale Families | p. 314 |
6.5.2 Its Role in the Recovery of Information | p. 316 |
6.6 Completeness | p. 318 |
6.6.1 Complete Sufficient Statistics | p. 320 |
6.6.2 Basu's Theorem | p. 324 |
6.7 Exercises and Complements | p. 327 |
7 Point Estimation | p. 341 |
7.1 Introduction | p. 341 |
7.2 Finding Estimators | p. 342 |
7.2.1 The Method of Moments | p. 342 |
7.2.2 The Method of Maximum Likelihood | p. 344 |
7.3 Criteria to Compare Estimators | p. 351 |
7.3.1 Unbiasedness, Variance and Mean Squared Error | p. 351 |
7.3.2 Best Unbiased and Linear Unbiased Estimators | p. 354 |
7.4 Improved Unbiased Estimator via Sufficiency | p. 358 |
7.4.1 The Rao-Blackwell Theorem | p. 358 |
7.5 Uniformly Minimum Variance Unbiased Estimator | p. 365 |
7.5.1 The Cramér-Rao Inequality and UMVUE | p. 366 |
7.5.2 The Lehmann-Scheffé Theorems and UMVUE | p. 371 |
7.5.3 A Generalization of the Cramér-Rao Inequality | p. 374 |
7.5.4 Evaluation of Conditional Expectations | p. 375 |
7.6 Unbiased Estimation Under Incompleteness | p. 377 |
7.6.1 Does the Rao-Blackwell Theorem Lead to UMVUE? | p. 377 |
7.7 Consistent Estimators | p. 380 |
7.8 Exercises and Complements | p. 382 |
8 Tests of Hypotheses | p. 395 |
8.1 Introduction | p. 395 |
8.2 Error Probabilities and the Power Function | p. 396 |
8.2.1 The Concept of a Best Test | p. 399 |
8.3 Simple Null Versus Simple Alternative Hypotheses | p. 401 |
8.3.1 Most Powerful Test via the Neyman-Pearson Lemma | p. 401 |
8.3.2 Applications: No Parameters Are Involved | p. 413 |
8.3.3 Applications: Observations Are Non-IID | p. 416 |
8.4 One-Sided Composite Alternative Hypothesis | p. 417 |
8.4.1 UMP Test via the Neyman-Pearson Lemma | p. 417 |
8.4.2 Monotone Likelihood Ratio Property | p. 420 |
8.4.3 UMP Test via MLR Property | p. 422 |
8.5 Simple Null Versus Two-Sided Alternative Hypotheses | p. 425 |
8.5.1 An Example Where UMP Test Does Not Exist | p. 425 |
8.5.2 An Example Where UMP Test Exists | p. 426 |
8.5.3 Unbiased and UMP Unbiased Tests | p. 428 |
8.6 Exercises and Complements | p. 429 |
9 Confidence Interval Estimation | p. 441 |
9.1 Introduction | p. 441 |
9.2 One-Sample Problems | p. 443 |
9.2.1 Inversion of a Test Procedure | p. 444 |
9.2.2 The Pivotal Approach | p. 446 |
9.2.3 The Interpretation of a Confidence Coefficient | p. 451 |
9.2.4 Ideas of Accuracy Measures | p. 452 |
9.2.5 Using Confidence Intervals in the Tests of Hypothesis | p. 455 |
9.3 Two-Sample Problems | p. 456 |
9.3.1 Comparing the Location Parameters | p. 456 |
9.3.2 Comparing the Scale Parameters | p. 460 |
9.4 Multiple Comparisons | p. 463 |
9.4.1 Estimating a Multivariate Normal Mean Vector | p. 463 |
9.4.2 Comparing the Means | p. 465 |
9.4.3 Comparing the Variances | p. 467 |
9.5 Exercises and Complements | p. 469 |
10 Bayesian Methods | p. 477 |
10.1 Introduction | p. 477 |
10.2 Prior and Posterior Distributions | p. 479 |
10.3 The Conjugate Priors | p. 481 |
10.4 Point Estimation | p. 485 |
10.5 Credible Intervals | p. 488 |
10.5.1 Highest Posterior Density | p. 489 |
10.5.2 Contrasting with the Confidence Intervals | p. 492 |
10.6 Tests of Hypotheses | p. 493 |
10.7 Examples with Non-Conjugate Priors | p. 494 |
10.8 Exercises and Complements | p. 497 |
11 Likelihood Ratio and Other Tests | p. 507 |
11.1 Introduction | p. 507 |
11.2 One-Sample Problems | p. 508 |
11.2.1 LR Test for the Mean | p. 509 |
11.2.2 LR Test for the Variance | p. 512 |
11.3 Two-Sample Problems | p. 515 |
11.3.1 Comparing the Means | p. 515 |
11.3.2 Comparing the Variances | p. 519 |
11.4 Bivariate Normal Observations | p. 522 |
11.4.1 Comparing the Means: The Paired Difference t Method | p. 522 |
11.4.2 LR Test for the Correlation Coefficient | p. 525 |
11.4.3 Tests for the Variances | p. 528 |
11.5 Exercises and Complements | p. 529 |
12 Large-Sample Inference | p. 539 |
12.1 Introduction | p. 539 |
12.2 The Maximum Likelihood Estimation | p. 539 |
12.3 Confidence Intervals and Tests of Hypothesis | p. 542 |
12.3.1 The Distribution-Free Population Mean | p. 543 |
12.3.2 The Binomial Proportion | p. 548 |
12.3.3 The Poisson Mean | p. 553 |
12.4 The Variance Stabilizing Transformations | p. 555 |
12.4.1 The Binomial Proportion | p. 556 |
12.4.2 The Poisson Mean | p. 559 |
12.4.3 The Correlation Coefficient | p. 560 |
12.5 Exercises and Complements | p. 563 |
13 Sample Size Determination: Two-Stage Procedures | p. 569 |
13.1 Introduction | p. 569 |
13.2 The Fixed-Width Confidence Interval | p. 573 |
13.2.1 Stein's Sampling Methodology | p. 573 |
13.2.2 Some Interesting Properties | p. 574 |
13.3 The Bounded Risk Point Estimation | p. 579 |
13.3.1 The Sampling Methodology | p. 581 |
13.3.2 Some Interesting Properties | p. 582 |
13.4 Exercises and Complements | p. 584 |
14 Appendix | p. 591 |
14.1 Abbreviations and Notation | p. 591 |
14.2 A Celebration of Statistics: Selected Biographical Notes | p. 593 |
14.3 Selected Statistical Tables | p. 621 |
14.3.1 The Standard Normal Distribution Function | p. 621 |
14.3.2 Percentage Points of the Chi-Square Distribution | p. 626 |
14.3.3 Percentage Points of the Student's t Distribution | p. 628 |
14.3.4 Percentage Points of the F Distribution | p. 630 |
References | p. 633 |
Index | p. 649 |