Library | Item Barcode | Call Number | Material Type | Item Category 1 | Status
---|---|---|---|---|---
 | 30000010127678 | QA276.9 A38 2005 | Open Access Book | Book | On Order
Summary
The process of inductive inference -- inferring general laws and principles from particular instances -- is the basis of statistical modeling, pattern recognition, and machine learning. The Minimum Description Length (MDL) principle, a powerful method of inductive inference, holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data -- that the more we are able to compress the data, the more we learn about the regularities underlying it. Advances in Minimum Description Length is a sourcebook that will introduce the scientific community to the foundations of MDL, recent theoretical advances, and practical applications. The book begins with an extensive tutorial on MDL, covering its theoretical underpinnings and practical implications, its various interpretations, and its underlying philosophy. The tutorial includes a brief history of MDL -- from its roots in the notion of Kolmogorov complexity to the beginning of MDL proper. The book then presents recent theoretical advances, introducing modern MDL methods in a way that is accessible to readers from many different scientific fields. The book concludes with examples of how to apply MDL in research settings that range from bioinformatics and machine learning to psychology.
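The idea that the best explanation is the one that compresses the data most can be made concrete with a small sketch. Below is an illustrative (not the book's own) two-part "crude MDL" criterion for polynomial model selection: each candidate degree is scored by the bits needed to encode its parameters, (k/2) log2 n, plus the bits needed to encode the residuals under a Gaussian code, (n/2) log2(RSS/n); the function name and data are assumptions for the example.

```python
import numpy as np

def two_part_mdl(x, y, max_degree=5):
    """Return (best_degree, scores): scores[d] is a crude two-part
    code length (in bits) for a degree-d polynomial fit."""
    n = len(x)
    scores = []
    for d in range(max_degree + 1):
        coeffs = np.polyfit(x, y, d)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        k = d + 1                                  # number of fitted parameters
        model_bits = 0.5 * k * np.log2(n)          # cost of encoding the parameters
        data_bits = 0.5 * n * np.log2(rss / n)     # cost of encoding the residuals
        scores.append(model_bits + data_bits)
    return int(np.argmin(scores)), scores

# Synthetic data from a quadratic: a complex-enough model captures the
# regularity; an overly complex one pays more in parameter bits than it
# saves in residual bits.
rng = np.random.default_rng(0)
x = np.linspace(0, 3, 100)
y = 2 + 3 * x - x**2 + rng.normal(0, 0.5, x.size)
best, scores = two_part_mdl(x, y)
print("MDL-selected degree:", best)
```

Underfitting models (degrees 0 and 1) leave large residuals, so their data cost dominates; overfitting models shave little off the residuals while paying a fixed per-parameter cost, which is the compression trade-off the summary describes.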
Table of Contents
Series Foreword | p. vii |
Preface | p. ix |
I Introductory Chapters | |
1 Introducing the Minimum Description Length Principle | p. 3 |
2 Minimum Description Length Tutorial | p. 23 |
3 MDL, Bayesian Inference, and the Geometry of the Space of Probability Distributions | p. 81 |
4 Hypothesis Testing for Poisson vs. Geometric Distributions Using Stochastic Complexity | p. 99 |
5 Applications of MDL to Selected Families of Models | p. 125 |
6 Algorithmic Statistics and Kolmogorov's Structure Functions | p. 151 |
II Theoretical Advances | |
7 Exact Minimax Predictive Density Estimation and MDL | p. 177 |
8 The Contribution of Parameters to Stochastic Complexity | p. 195 |
9 Extended Stochastic Complexity and Its Applications to Learning | p. 215 |
10 Kolmogorov's Structure Function in MDL Theory and Lossy Data Compression | p. 245 |
III Practical Applications | |
11 Minimum Message Length and Generalized Bayesian Nets with Asymmetric Languages | p. 265 |
12 Simultaneous Clustering and Subset Selection via MDL | p. 295 |
13 An MDL Framework for Data Clustering | p. 323 |
14 Minimum Description Length and Psychological Clustering Models | p. 355 |
15 A Minimum Description Length Principle for Perception | p. 385 |
16 Minimum Description Length and Cognitive Modeling | p. 411 |
Index | p. 435 |