Title:
Principles of artificial neural networks
Personal Author:
Series:
Advanced series on circuits and systems ; 6
Edition:
2nd ed.
Publication Information:
Singapore : World Scientific, 2007
ISBN:
9789812706249

Available:

Item Barcode     Call Number        Material Type      Item Category 1   Status
30000010167792   QA76.87 G72 2007   Open Access Book   Book
30000010215441   QA76.87 G72 2007   Open Access Book   Book

On Order

Table of Contents

Acknowledgments p. vii
Preface to the First Edition p. ix
Preface to the Second Edition p. xi
Chapter 1 Introduction and Role of Artificial Neural Networks p. 1
Chapter 2 Fundamentals of Biological Neural Networks p. 5
Chapter 3 Basic Principles of ANNs and Their Early Structures p. 9
3.1 Basic Principles of ANN Design p. 9
3.2 Basic Network Structures p. 10
3.3 The Perceptron's Input-Output Principles p. 11
3.4 The Adaline (ALC) p. 12
Chapter 4 The Perceptron p. 17
4.1 The Basic Structure p. 17
4.2 The Single-Layer Representation Problem p. 22
4.3 The Limitations of the Single-Layer Perceptron p. 23
4.4 Many-Layer Perceptrons p. 24
4.A Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification) p. 25
Chapter 5 The Madaline p. 37
5.1 Madaline Training p. 37
5.A Madaline Case Study: Character Recognition p. 39
Chapter 6 Back Propagation p. 59
6.1 The Back Propagation Learning Procedure p. 59
6.2 Derivation of the BP Algorithm p. 59
6.3 Modified BP Algorithms p. 63
6.A Back Propagation Case Study: Character Recognition p. 65
6.B Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) p. 76
6.C Back Propagation Case Study: The XOR Problem (3-Layer BP Network) p. 94
Chapter 7 Hopfield Networks p. 113
7.1 Introduction p. 113
7.2 Binary Hopfield Networks p. 113
7.3 Setting of Weights in Hopfield Nets - Bidirectional Associative Memory (BAM) Principle p. 114
7.4 Walsh Functions p. 117
7.5 Network Stability p. 118
7.6 Summary of the Procedure for Implementing the Hopfield Network p. 121
7.7 Continuous Hopfield Models p. 122
7.8 The Continuous Energy (Lyapunov) Function p. 123
7.A Hopfield Network Case Study: Character Recognition p. 125
7.B Hopfield Network Case Study: Traveling Salesman Problem p. 136
Chapter 8 Counter Propagation p. 161
8.1 Introduction p. 161
8.2 Kohonen Self-Organizing Map (SOM) Layer p. 161
8.3 Grossberg Layer p. 162
8.4 Training of the Kohonen Layer p. 162
8.5 Training of Grossberg Layers p. 165
8.6 The Combined Counter Propagation Network p. 165
8.A Counter Propagation Network Case Study: Character Recognition p. 166
Chapter 9 Adaptive Resonance Theory p. 179
9.1 Motivation p. 179
9.2 The ART Network Structure p. 179
9.3 Setting-Up of the ART Network p. 183
9.4 Network Operation p. 184
9.5 Properties of ART p. 186
9.6 Discussion and General Comments on ART-I and ART-II p. 186
9.A ART-I Network Case Study: Character Recognition p. 187
9.B ART-I Case Study: Speech Recognition p. 201
Chapter 10 The Cognitron and the Neocognitron p. 209
10.1 Background of the Cognitron p. 209
10.2 The Basic Principles of the Cognitron p. 209
10.3 Network Operation p. 209
10.4 Cognitron's Network Training p. 211
10.5 The Neocognitron p. 213
Chapter 11 Statistical Training p. 215
11.1 Fundamental Philosophy p. 215
11.2 Annealing Methods p. 216
11.3 Simulated Annealing by Boltzmann Training of Weights p. 216
11.4 Stochastic Determination of Magnitude of Weight Change p. 217
11.5 Temperature-Equivalent Setting p. 217
11.6 Cauchy Training of Neural Network p. 217
11.A Statistical Training Case Study: A Stochastic Hopfield Network for Character Recognition p. 219
11.B Statistical Training Case Study: Identifying AR Signal Parameters with a Stochastic Perceptron Model p. 222
Chapter 12 Recurrent (Time Cycling) Back Propagation Networks p. 233
12.1 Recurrent/Discrete Time Networks p. 233
12.2 Fully Recurrent Networks p. 234
12.3 Continuously Recurrent Back Propagation Networks p. 235
12.A Recurrent Back Propagation Case Study: Character Recognition p. 236
Chapter 13 Large Scale Memory Storage and Retrieval (LAMSTAR) Network p. 249
13.1 Basic Principles of the LAMSTAR Neural Network p. 249
13.2 Detailed Outline of the LAMSTAR Network p. 251
13.3 Forgetting Feature p. 257
13.4 Training vs. Operational Runs p. 258
13.5 Advanced Data Analysis Capabilities p. 259
13.6 Correlation, Interpolation, Extrapolation and Innovation-Detection p. 261
13.7 Concluding Comments and Discussion of Applicability p. 262
13.A LAMSTAR Network Case Study: Character Recognition p. 265
13.B Application to Medical Diagnosis Problems p. 280
Problems p. 285
References p. 291
Author Index p. 299
Subject Index p. 301