Title:
Machine learning : modeling data locally and globally
Series:
Advanced topics in science and technology in China
Publication Information:
Hangzhou : Zhejiang University Press/Springer, 2008
Physical Description:
x, 169 p. : ill. ; 24 cm.
ISBN:
9783540794516
Call Number:
Q325.5 M32 2008
Item Barcode:
30000010202650
Material Type:
Book
Summary

Machine Learning: Modeling Data Locally and Globally presents a novel, unified theory that seamlessly integrates different algorithms. Specifically, the book characterizes the inner nature of machine learning algorithms as either "local learning" or "global learning." This theory not only connects previous machine learning methods and serves as a roadmap across various models but, more importantly, also motivates a framework that learns from data both locally and globally, helping researchers gain deeper insight and a comprehensive understanding of the techniques in this field. The book reviews current topics, new theories, and applications.
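The local/global distinction above can be illustrated with a minimal sketch (not taken from the book; it only assumes NumPy). A generative Gaussian classifier stands in for "global learning", since its decision uses summary statistics fitted to all the data, while a 1-nearest-neighbour rule stands in for "local learning", since its decision depends only on the training point closest to the query:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated Gaussian classes in 2-D
X0 = rng.normal(loc=[-2.0, 0.0], scale=1.0, size=(50, 2))
X1 = rng.normal(loc=[2.0, 0.0], scale=1.0, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

# "Global" learner: fit a Gaussian to each class and classify by the
# higher class-conditional log-density (uses statistics of ALL points).
def fit_gaussian(Xc):
    return Xc.mean(axis=0), np.cov(Xc.T)

def log_density(x, mu, cov):
    d = x - mu
    return -0.5 * d @ np.linalg.inv(cov) @ d - 0.5 * np.log(np.linalg.det(cov))

params = [fit_gaussian(X0), fit_gaussian(X1)]

def predict_global(x):
    return int(np.argmax([log_density(x, mu, cov) for mu, cov in params]))

# "Local" learner: 1-nearest neighbour -- the label of the single
# closest training point decides, ignoring global structure.
def predict_local(x):
    return int(y[np.argmin(np.linalg.norm(X - x, axis=1))])

acc_global = float(np.mean(np.array([predict_global(x) for x in X]) == y))
acc_local = float(np.mean(np.array([predict_local(x) for x in X]) == y))
print(acc_global, acc_local)
```

Both rules classify this easy synthetic data well; the book's thesis is that models such as the Maxi-Min Margin Machine can combine the two views, rather than committing to one.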

Kaizhu Huang was a researcher at the Fujitsu Research and Development Center and is currently a research fellow at the Chinese University of Hong Kong. Haiqin Yang leads the image processing group at HiSilicon Technologies. Irwin King and Michael R. Lyu are professors in the Department of Computer Science and Engineering at the Chinese University of Hong Kong.


Table of Contents

1 Introduction p. 1
1.1 Learning and Global Modeling p. 1
1.2 Learning and Local Modeling p. 3
1.3 Hybrid Learning p. 5
1.4 Major Contributions p. 5
1.5 Scope p. 8
1.6 Book Organization p. 8
References p. 9
2 Global Learning vs. Local Learning p. 13
2.1 Problem Definition p. 15
2.2 Global Learning p. 16
2.2.1 Generative Learning p. 16
2.2.2 Non-parametric Learning p. 19
2.2.3 The Minimum Error Minimax Probability Machine p. 21
2.3 Local Learning p. 22
2.4 Hybrid Learning p. 23
2.5 Maxi-Min Margin Machine p. 24
References p. 25
3 A General Global Learning Model: MEMPM p. 29
3.1 Marshall and Olkin Theory p. 30
3.2 Minimum Error Minimax Probability Decision Hyperplane p. 31
3.2.1 Problem Definition p. 31
3.2.2 Interpretation p. 32
3.2.3 Special Case for Biased Classifications p. 33
3.2.4 Solving the MEMPM Optimization Problem p. 34
3.2.5 When the Worst-case Bayes Optimal Hyperplane Becomes the True One p. 39
3.2.6 Geometrical Interpretation p. 42
3.3 Robust Version p. 45
3.4 Kernelization p. 46
3.4.1 Kernelization Theory for BMPM p. 47
3.4.2 Notations in Kernelization Theorem of BMPM p. 48
3.4.3 Kernelization Results p. 49
3.5 Experiments p. 50
3.5.1 Model Illustration on a Synthetic Dataset p. 50
3.5.2 Evaluations on Benchmark Datasets p. 50
3.5.3 Evaluations of BMPM on Heart-disease Dataset p. 55
3.6 How Tight Is the Bound? p. 56
3.7 On the Concavity of MEMPM p. 60
3.8 Limitations and Future Work p. 65
3.9 Summary p. 66
References p. 67
4 Learning Locally and Globally: Maxi-Min Margin Machine p. 69
4.1 Maxi-Min Margin Machine p. 71
4.1.1 Separable Case p. 71
4.1.2 Connections with Other Models p. 74
4.1.3 Nonseparable Case p. 78
4.1.4 Further Connection with Minimum Error Minimax Probability Machine p. 80
4.2 Bound on the Error Rate p. 82
4.3 Reduction p. 84
4.4 Kernelization p. 85
4.4.1 Foundation of Kernelization for M^4 p. 85
4.4.2 Kernelization Result p. 86
4.5 Experiments p. 88
4.5.1 Evaluations on Three Synthetic Toy Datasets p. 88
4.5.2 Evaluations on Benchmark Datasets p. 90
4.6 Discussions and Future Work p. 93
4.7 Summary p. 93
References p. 94
5 Extension I: BMPM for Imbalanced Learning p. 97
5.1 Introduction to Imbalanced Learning p. 98
5.2 Biased Minimax Probability Machine p. 98
5.3 Learning from Imbalanced Data by Using BMPM p. 100
5.3.1 Four Criteria to Evaluate Learning from Imbalanced Data p. 100
5.3.2 BMPM for Maximizing the Sum of the Accuracies p. 101
5.3.3 BMPM for ROC Analysis p. 102
5.4 Experimental Results p. 102
5.4.1 A Toy Example p. 102
5.4.2 Evaluations on Real World Imbalanced Datasets p. 104
5.4.3 Evaluations on Disease Datasets p. 111
5.5 When the Cost for Each Class Is Known p. 114
5.6 Summary p. 115
References p. 115
6 Extension II: A Regression Model from M^4 p. 119
6.1 A Local Support Vector Regression Model p. 121
6.1.1 Problem and Model Definition p. 121
6.1.2 Interpretations and Appealing Properties p. 122
6.2 Connection with Support Vector Regression p. 122
6.3 Link with Maxi-Min Margin Machine p. 124
6.4 Optimization Method p. 124
6.5 Kernelization p. 125
6.6 Additional Interpretation on w^T Σ_i w p. 127
6.7 Experiments p. 128
6.7.1 Evaluations on Synthetic Sinc Data p. 128
6.7.2 Evaluations on Real Financial Data p. 130
6.8 Summary p. 131
References p. 131
7 Extension III: Variational Margin Settings within Local Data p. 133
7.1 Support Vector Regression p. 134
7.2 Problem in Margin Settings p. 136
7.3 General ε-insensitive Loss Function p. 136
7.4 Non-fixed Margin Cases p. 139
7.4.1 Momentum p. 139
7.4.2 GARCH p. 140
7.5 Experiments p. 141
7.5.1 Accuracy Metrics and Risk Measurement p. 141
7.5.2 Momentum p. 142
7.5.3 GARCH p. 149
7.6 Discussions p. 155
References p. 158
8 Conclusion and Future Work p. 161
8.1 Review of the Journey p. 161
8.2 Future Work p. 163
8.2.1 Inside the Proposed Models p. 163
8.2.2 Beyond the Proposed Models p. 164
References p. 164
Index p. 167