Title:
Neural networks and computing : learning algorithms and applications
Series:
Series in electrical and computer engineering ; 7
Publication Information:
London : Imperial College Press, 2007
Physical Description:
1 v. + 1 CD-ROM
ISBN:
9781860947582
General Note:
Accompanied by compact disc : CP 11984
Available:

Item Barcode: 30000010159365
Call Number: QA76.87 C465 2007
Item Category: Open Access
Material Type: Book
Summary
This book covers neural networks, with special emphasis on advanced learning methodologies and applications. It addresses practical issues such as weight initialization, stalling of learning, and escape from local minima, which many existing books in this area do not cover. It also highlights the important feature selection problem, which baffles many neural network practitioners because of the difficulty of handling large datasets, and presents several interesting IT, engineering and bioinformatics applications.


Table of Contents

Preface  p. v
1 Introduction  p. 1
1.1 Background  p. 1
1.2 Neuron Model  p. 2
1.3 Historical Remarks  p. 4
1.4 Network Architecture  p. 6
1.4.1 Supervised Neural Networks  p. 6
1.4.1.1 McCulloch and Pitts Model  p. 7
1.4.1.2 The Perceptron Model  p. 11
1.4.1.3 Multi-layer Feedforward Network  p. 14
1.4.1.4 Recurrent Networks  p. 15
1.4.2 Unsupervised Neural Networks  p. 17
1.5 Modeling and Learning Mechanism  p. 19
1.5.1 Determination of Parameters  p. 20
1.5.2 Gradient Descent Searching Method  p. 26
Exercises  p. 28
2 Learning Performance and Enhancement  p. 31
2.1 Fundamentals of Gradient Descent Optimization  p. 32
2.2 Conventional Backpropagation Algorithm  p. 35
2.3 Convergence Enhancement  p. 42
2.3.1 Extended Backpropagation Algorithm  p. 44
2.3.2 Least Squares Based Training Algorithm  p. 47
2.3.3 Extended Least Squares Based Algorithm  p. 55
2.4 Initialization Consideration  p. 59
2.4.1 Weight Initialization Algorithm I  p. 61
2.4.2 Weight Initialization Algorithm II  p. 64
2.4.3 Weight Initialization Algorithm III  p. 67
2.5 Global Learning Algorithms  p. 69
2.5.1 Simulated Annealing Algorithm  p. 70
2.5.2 Alopex Algorithm  p. 71
2.5.3 Reactive Tabu Search  p. 72
2.5.4 The NOVEL Algorithm  p. 73
2.5.5 The Heuristic Hybrid Global Learning Algorithm  p. 74
2.6 Concluding Remarks  p. 82
2.6.1 Fast Learning Algorithms  p. 82
2.6.2 Weight Initialization Methods  p. 83
2.6.3 Global Learning Algorithms  p. 84
Appendix 2.1  p. 85
Exercises  p. 87
3 Generalization and Performance Enhancement  p. 91
3.1 Cost Function and Performance Surface  p. 93
3.1.1 Maximum Likelihood Estimation  p. 94
3.1.2 The Least-Square Cost Function  p. 95
3.2 Higher-Order Statistic Generalization  p. 98
3.2.1 Definitions and Properties of Higher-Order Statistics  p. 99
3.2.2 The Higher-Order Cumulants Based Cost Function  p. 101
3.2.3 Property of the Higher-Order Cumulant Cost Function  p. 105
3.2.4 Learning and Generalization Performance  p. 108
3.2.4.1 Experiment One: Henon Attractor  p. 109
3.2.4.2 Experiment Two: Sunspot Time-Series  p. 116
3.3 Regularization for Generalization Enhancement  p. 117
3.3.1 Adaptive Regularization Parameter Selection (ARPS) Method  p. 120
3.3.1.1 Stalling Identification Method  p. 121
3.3.1.2 λ Selection Schemes  p. 122
3.3.2 Synthetic Function Mapping  p. 124
3.4 Concluding Remarks  p. 126
3.4.1 Objective Function Selection  p. 128
3.4.2 Regularization Selection  p. 129
Appendix 3.1 Confidence Upper Bound of Approximation Error  p. 131
Appendix 3.2 Proof of the Property of the HOC Cost Function  p. 133
Appendix 3.3 The Derivation of the Sufficient Conditions of the Regularization Parameter  p. 136
Exercises  p. 137
4 Basis Function Networks for Classification  p. 139
4.1 Linear Separation and Perceptrons  p. 140
4.2 Basis Function Model for Parametric Smoothing  p. 142
4.3 Radial Basis Function Network  p. 144
4.3.1 RBF Network Architecture  p. 144
4.3.2 Universal Approximation  p. 146
4.3.3 Initialization and Clustering  p. 149
4.3.4 Learning Algorithms  p. 152
4.3.4.1 Linear Weights Optimization  p. 152
4.3.4.2 Gradient Descent Optimization  p. 154
4.3.4.3 Hybrid of Least Squares and Penalized Optimization  p. 155
4.3.5 Regularization Networks  p. 157
4.4 Advanced Radial Basis Function Networks  p. 159
4.4.1 Support Vector Machine  p. 159
4.4.2 Wavelet Network  p. 161
4.4.3 Fuzzy RBF Controllers  p. 164
4.4.4 Probabilistic Neural Networks  p. 167
4.5 Concluding Remarks  p. 169
Exercises  p. 170
5 Self-Organizing Maps  p. 173
5.1 Introduction  p. 173
5.2 Self-Organizing Maps  p. 177
5.2.1 Learning Algorithm  p. 178
5.3 Growing SOMs  p. 182
5.3.1 Cell Splitting Grid  p. 182
5.3.2 Growing Hierarchical Self-Organizing Quadtree Map  p. 185
5.4 Probabilistic SOMs  p. 188
5.4.1 Cellular Probabilistic SOM  p. 188
5.4.2 Probabilistic Regularized SOM  p. 193
5.5 Clustering of SOM  p. 202
5.6 Multi-Layer SOM for Tree-Structured Data  p. 205
5.6.1 SOM Input Representation  p. 207
5.6.2 MLSOM Training  p. 210
5.6.3 MLSOM Visualization and Classification  p. 212
Exercises  p. 216
6 Classification and Feature Selection  p. 219
6.1 Introduction  p. 219
6.2 Support Vector Machines (SVM)  p. 223
6.2.1 Support Vector Machine Visualization (SVMV)  p. 224
6.3 Cost Function  p. 229
6.3.1 MSE and MCE Cost Functions  p. 230
6.3.2 Hybrid MCE-MSE Cost Function  p. 232
6.3.3 Implementing MCE-MSE  p. 236
6.4 Feature Selection  p. 239
6.4.1 Information Theory  p. 241
6.4.1.1 Mutual Information  p. 241
6.4.1.2 Probability Density Function (pdf) Estimation  p. 243
6.4.2 MI-Based Forward Feature Selection  p. 245
6.4.2.1 MIFS and MIFS-U  p. 247
6.4.2.2 Using Quadratic MI  p. 248
Exercises  p. 253
7 Engineering Applications  p. 255
7.1 Electric Load Forecasting  p. 255
7.1.1 Nonlinear Autoregressive Integrated Neural Network Model  p. 257
7.1.2 Case Studies  p. 261
7.2 Content-Based Image Retrieval Using SOM  p. 266
7.2.1 GHSOQM-Based CBIR Systems  p. 267
7.2.1.1 Overall Architecture of the GHSOQM-Based CBIR System  p. 267
7.2.1.2 Image Segmentation, Feature Extraction and Region-Based Feature Matrices  p. 268
7.2.1.3 Image Distance  p. 269
7.2.1.4 GHSOQM and Relevance Feedback in the CBIR System  p. 270
7.2.2 Performance Evaluation  p. 274
7.3 Feature Selection for cDNA Microarray  p. 278
7.3.1 MI-Based Forward Feature Selection Scheme  p. 279
7.3.2 The Supervised Grid-Based Redundancy Elimination  p. 280
7.3.3 The Forward Gene Selection Process Using MIIO and MISF  p. 281
7.3.4 Results  p. 282
7.3.4.1 Prostate Cancer Classification Dataset  p. 284
7.3.4.2 Subtype of ALL Classification Dataset  p. 288
7.3.5 Remarks  p. 294
Bibliography  p. 291
Index  p. 305