Title:
Advances in minimum description length : theory and applications
Series:
Neural information processing series
Publication Information:
Cambridge, MA : MIT Press, 2005
ISBN:
9780262072625

Available:

Item Barcode: 30000010127678
Call Number: QA276.9 A38 2005
Material Type: Open Access Book
Item Category: Book


Summary

The process of inductive inference -- inferring general laws and principles from particular instances -- is the basis of statistical modeling, pattern recognition, and machine learning. The Minimum Description Length (MDL) principle, a powerful method of inductive inference, holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data -- the more we are able to compress the data, the more we learn about the regularities underlying it. Advances in Minimum Description Length is a sourcebook that introduces the scientific community to the foundations of MDL, recent theoretical advances, and practical applications.

The book begins with an extensive tutorial on MDL, covering its theoretical underpinnings, practical implications, various interpretations, and underlying philosophy. The tutorial includes a brief history of MDL, from its roots in the notion of Kolmogorov complexity to the beginning of MDL proper. The book then presents recent theoretical advances, introducing modern MDL methods in a way that is accessible to readers from many different scientific fields. It concludes with examples of how to apply MDL in research settings ranging from bioinformatics and machine learning to psychology.
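
To make the compression idea concrete, here is a minimal, hypothetical sketch (not taken from the book) of two-part MDL model selection in Python: a polynomial degree is chosen by minimizing an approximate total description length L(model) + L(data | model), using a crude BIC-style bit count; the data, constants, and function names below are illustrative assumptions only.

# Crude two-part MDL sketch (illustrative, not from the book): pick the polynomial
# degree that minimizes  L(model) + L(data | model), approximated in bits by
#   (k/2) * log2(n)          -- cost of encoding the k fitted parameters
# + (n/2) * log2(RSS / n)    -- cost of encoding the residuals under Gaussian noise
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = np.linspace(-1.0, 1.0, n)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(scale=0.2, size=n)  # data from a quadratic plus noise

def description_length(degree):
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    k = degree + 1                           # number of fitted parameters
    l_model = 0.5 * k * np.log2(n)           # bits to state the parameters
    l_data = 0.5 * n * np.log2(rss / n)      # (relative) bits to state the residuals
    return l_model + l_data

best = min(range(1, 10), key=description_length)
print("degree minimizing total description length:", best)  # typically 2

More complex polynomials shrink the residual term only slightly while paying more bits for their parameters, so the quadratic that actually generated the data usually wins -- illustrating the sense in which the best-compressing hypothesis captures the underlying regularity.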




Table of Contents

Series Foreword p. vii
Preface p. ix
I Introductory Chapters
1 Introducing the Minimum Description Length Principle (Peter D. Grunwald) p. 3
2 Minimum Description Length Tutorial (Peter D. Grunwald) p. 23
3 MDL, Bayesian Inference, and the Geometry of the Space of Probability Distributions (Vijay Balasubramanian) p. 81
4 Hypothesis Testing for Poisson vs. Geometric Distributions Using Stochastic Complexity (Aaron D. Lanterman) p. 99
5 Applications of MDL to Selected Families of Models (Andrew J. Hanson and Philip Chi-Wing Fu) p. 125
6 Algorithmic Statistics and Kolmogorov's Structure Functions (Paul Vitanyi) p. 151
II Theoretical Advances
7 Exact Minimax Predictive Density Estimation and MDL (Feng Liang and Andrew Barron) p. 177
8 The Contribution of Parameters to Stochastic Complexity (Dean P. Foster and Robert A. Stine) p. 195
9 Extended Stochastic Complexity and Its Applications to Learning (Kenji Yamanishi) p. 215
10 Kolmogorov's Structure Function in MDL Theory and Lossy Data Compression (Jorma Rissanen and Ioan Tabus) p. 245
III Practical Applications
11 Minimum Message Length and Generalized Bayesian Nets with Asymmetric Languages (Joshua W. Comley and David L. Dowe) p. 265
12 Simultaneous Clustering and Subset Selection via MDL (Rebecka Jornsten and Bin Yu) p. 295
13 An MDL Framework for Data Clustering (Petri Kontkanen, Petri Myllymaki, Wray Buntine, Jorma Rissanen, and Henry Tirri) p. 323
14 Minimum Description Length and Psychological Clustering Models (Michael D. Lee and Daniel J. Navarro) p. 355
15 A Minimum Description Length Principle for Perception (Nick Chater) p. 385
16 Minimum Description Length and Cognitive Modeling (Yong Su, In Jae Myung, and Mark A. Pitt) p. 411
Index p. 435