Title:
Principles of artificial neural networks
Series:
Advanced series on circuits and systems ; 6
Edition:
2nd ed.
Publication Information:
Singapore : World Scientific, 2007
ISBN:
9789812706249
Available:
Item Barcode | Call Number | Material Type | Item Category 1
---|---|---|---
30000010167792 | QA76.87 G72 2007 | Open Access Book | Book
30000010215441 | QA76.87 G72 2007 | Open Access Book | Book
On Order
Table of Contents
Acknowledgments | p. vii |
Preface to the First Edition | p. ix |
Preface to the Second Edition | p. xi |
Chapter 1 Introduction and Role of Artificial Neural Networks | p. 1 |
Chapter 2 Fundamentals of Biological Neural Networks | p. 5 |
Chapter 3 Basic Principles of ANNs and Their Early Structures | p. 9 |
3.1 Basic Principles of ANN Design | p. 9 |
3.2 Basic Network Structures | p. 10 |
3.3 The Perceptron's Input-Output Principles | p. 11 |
3.4 The Adaline (ALC) | p. 12 |
Chapter 4 The Perceptron | p. 17 |
4.1 The Basic Structure | p. 17 |
4.2 The Single-Layer Representation Problem | p. 22 |
4.3 The Limitations of the Single-Layer Perceptron | p. 23 |
4.4 Many-Layer Perceptrons | p. 24 |
4.A Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification) | p. 25 |
Chapter 5 The Madaline | p. 37 |
5.1 Madaline Training | p. 37 |
5.A Madaline Case Study: Character Recognition | p. 39 |
Chapter 6 Back Propagation | p. 59 |
6.1 The Back Propagation Learning Procedure | p. 59 |
6.2 Derivation of the BP Algorithm | p. 59 |
6.3 Modified BP Algorithms | p. 63 |
6.A Back Propagation Case Study: Character Recognition | p. 65 |
6.B Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) | p. 76 |
6.C Back Propagation Case Study: The XOR Problem - 3 Layer BP Network | p. 94 |
Chapter 7 Hopfield Networks | p. 113 |
7.1 Introduction | p. 113 |
7.2 Binary Hopfield Networks | p. 113 |
7.3 Setting of Weights in Hopfield Nets - Bidirectional Associative Memory (BAM) Principle | p. 114 |
7.4 Walsh Functions | p. 117 |
7.5 Network Stability | p. 118 |
7.6 Summary of the Procedure for Implementing the Hopfield Network | p. 121 |
7.7 Continuous Hopfield Models | p. 122 |
7.8 The Continuous Energy (Lyapunov) Function | p. 123 |
7.A Hopfield Network Case Study: Character Recognition | p. 125 |
7.B Hopfield Network Case Study: Traveling Salesman Problem | p. 136 |
Chapter 8 Counter Propagation | p. 161 |
8.1 Introduction | p. 161 |
8.2 Kohonen Self-Organizing Map (SOM) Layer | p. 161 |
8.3 Grossberg Layer | p. 162 |
8.4 Training of the Kohonen Layer | p. 162 |
8.5 Training of Grossberg Layers | p. 165 |
8.6 The Combined Counter Propagation Network | p. 165 |
8.A Counter Propagation Network Case Study: Character Recognition | p. 166 |
Chapter 9 Adaptive Resonance Theory | p. 179 |
9.1 Motivation | p. 179 |
9.2 The ART Network Structure | p. 179 |
9.3 Setting-Up of the ART Network | p. 183 |
9.4 Network Operation | p. 184 |
9.5 Properties of ART | p. 186 |
9.6 Discussion and General Comments on ART-I and ART-II | p. 186 |
9.A ART-I Network Case Study: Character Recognition | p. 187 |
9.B ART-I Case Study: Speech Recognition | p. 201 |
Chapter 10 The Cognitron and the Neocognitron | p. 209 |
10.1 Background of the Cognitron | p. 209 |
10.2 The Basic Principles of the Cognitron | p. 209 |
10.3 Network Operation | p. 209 |
10.4 Cognitron's Network Training | p. 211 |
10.5 The Neocognitron | p. 213 |
Chapter 11 Statistical Training | p. 215 |
11.1 Fundamental Philosophy | p. 215 |
11.2 Annealing Methods | p. 216 |
11.3 Simulated Annealing by Boltzmann Training of Weights | p. 216 |
11.4 Stochastic Determination of Magnitude of Weight Change | p. 217 |
11.5 Temperature-Equivalent Setting | p. 217 |
11.6 Cauchy Training of Neural Network | p. 217 |
11.A Statistical Training Case Study - A Stochastic Hopfield Network for Character Recognition | p. 219 |
11.B Statistical Training Case Study: Identifying AR Signal Parameters with a Stochastic Perceptron Model | p. 222 |
Chapter 12 Recurrent (Time Cycling) Back Propagation Networks | p. 233 |
12.1 Recurrent/Discrete Time Networks | p. 233 |
12.2 Fully Recurrent Networks | p. 234 |
12.3 Continuously Recurrent Back Propagation Networks | p. 235 |
12.A Recurrent Back Propagation Case Study: Character Recognition | p. 236 |
Chapter 13 Large Scale Memory Storage and Retrieval (LAMSTAR) Network | p. 249 |
13.1 Basic Principles of the LAMSTAR Neural Network | p. 249 |
13.2 Detailed Outline of the LAMSTAR Network | p. 251 |
13.3 Forgetting Feature | p. 257 |
13.4 Training vs. Operational Runs | p. 258 |
13.5 Advanced Data Analysis Capabilities | p. 259 |
13.6 Correlation, Interpolation, Extrapolation and Innovation-Detection | p. 261 |
13.7 Concluding Comments and Discussion of Applicability | p. 262 |
13.A LAMSTAR Network Case Study: Character Recognition | p. 265 |
13.B Application to Medical Diagnosis Problems | p. 280 |
Problems | p. 285 |
References | p. 291 |
Author Index | p. 299 |
Subject Index | p. 301 |