Holdings
Item Barcode | Call Number | Material Type | Item Category |
---|---|---|---|
30000010129533 | R857.S47 N66 2000 v.1 | Open Access Book | Great Book |
30000010129532 | R857.S47 N66 2000 v.2 | Open Access Book | Great Book |
Summary
For the first time, eleven experts in the fields of signal processing and biomedical engineering have contributed to an edition on the newest theories and applications of fuzzy logic, neural networks, and algorithms in biomedicine. Nonlinear Biomedical Signal Processing, Volume I provides comprehensive coverage of nonlinear signal processing techniques. In the last decade, theoretical developments in the concept of fuzzy logic have led to several new approaches to neural networks. This compilation delivers plenty of real-world examples for a variety of implementations and applications of nonlinear signal processing technologies to biomedical problems. Included here are discussions that combine the various structures of Kohonen, Hopfield, and multiple-layer "designer" networks with other approaches to produce hybrid systems. Comparative analysis is made of genetic, back-propagation, Bayesian, and other learning algorithms.
Topics covered include:
- Uncertainty management
- Analysis of biomedical signals
- A guided tour of neural networks
- Application of algorithms to EEG and heart rate variability signals
- Event detection and sample stratification in genomic sequences
- Applications of multivariate analysis methods to measure glucose concentration

Nonlinear Biomedical Signal Processing, Volume I is a valuable reference tool for medical researchers, medical faculty, and advanced graduate students, as well as for practicing biomedical engineers. It is an excellent companion to Nonlinear Biomedical Signal Processing, Volume II: Dynamic Analysis and Modeling.
Author Notes
About the Editor
Metin Akay is currently an assistant professor at Dartmouth College. A noted speaker, editor, and author, Dr. Akay has spent several years conducting research in the areas of fuzzy neural networks and signal processing, wavelet transforms, and detection and estimation theory. His biomedical research areas include the autonomic nervous system, maturation, respiratory-related evoked response, noninvasive detection of coronary artery disease, and estimation of cardiac output. Dr. Akay is the founding series editor of the IEEE Press Series on Biomedical Engineering. In 1997 he received the prestigious Early Career Achievement Award from the IEEE Engineering in Medicine and Biology Society (EMBS). He is the program chair of both the annual IEEE EMBS Conference and the Summer School for 2001. Dr. Akay has published several papers in the field and authored or coauthored eleven books, including Time Frequency and Wavelets in Biomedical Signal Processing (IEEE Press, 1998) and Nonlinear Biomedical Signal Processing, Volume II: Dynamic Analysis and Modeling (IEEE Press, 2000). He holds two U.S. patents.
Table of Contents
Preface | p. xiii |
List of Contributors | p. xv |
Chapter 1 Uncertainty Management in Medical Applications | p. 1 |
1. Introduction | p. 1 |
2. Imperfect Knowledge | p. 1 |
2.1. Types of Imperfections | p. 1 |
2.1.1. Uncertainties | p. 1 |
2.1.2. Imprecisions | p. 2 |
2.1.3. Incompleteness | p. 2 |
2.1.4. Causes of Imperfect Knowledge | p. 2 |
2.2. Choice of a Method | p. 2 |
3. Fuzzy Set Theory | p. 4 |
3.1. Introduction to Fuzzy Set Theory | p. 4 |
3.2. Main Basic Concepts of Fuzzy Set Theory | p. 5 |
3.2.1. Definitions | p. 5 |
3.2.2. Operations on Fuzzy Sets | p. 6 |
3.2.3. The Zadeh Extension Principle | p. 8 |
3.3. Fuzzy Arithmetic | p. 10 |
3.4. Fuzzy Relations | p. 11 |
4. Possibility Theory | p. 12 |
4.1. Possibility Measures | p. 12 |
4.2. Possibility Distributions | p. 14 |
4.3. Necessity Measures | p. 15 |
4.4. Relative Possibility and Necessity of Fuzzy Sets | p. 17 |
5. Approximate Reasoning | p. 17 |
5.1. Linguistic Variables | p. 17 |
5.2. Fuzzy Propositions | p. 19 |
5.3. Possibility Distribution Associated with a Fuzzy Proposition | p. 19 |
5.4. Fuzzy Implications | p. 21 |
5.5. Fuzzy Inferences | p. 22 |
6. Examples of Applications of Numerical Methods in Biology | p. 23 |
7. Conclusion | p. 24 |
References | p. 25 |
Chapter 2 Applications of Fuzzy Clustering to Biomedical Signal Processing and Dynamic System Identification | p. 27 |
1. Introduction | p. 27 |
1.1. Time Series Prediction and System Identification | p. 28 |
1.2. Fuzzy Clustering | p. 29 |
1.3. Nonstationary Signal Processing Using Unsupervised Fuzzy Clustering | p. 29 |
2. Methods | p. 30 |
2.1. State Recognition and Time Series Prediction Using Unsupervised Fuzzy Clustering | p. 31 |
2.2. Feature Extraction and Reduction | p. 32 |
2.2.1. Spectrum Estimation | p. 33 |
2.2.2. Time-Frequency Analysis | p. 33 |
2.3. The Hierarchical Unsupervised Fuzzy Clustering (HUFC) Algorithm | p. 34 |
2.4. The Weighted Unsupervised Optimal Fuzzy Clustering (WUOFC) Algorithm | p. 36 |
2.5. The Weighted Fuzzy K-Mean (WFKM) Algorithm | p. 37 |
2.6. The Fuzzy Hypervolume Cluster Validity Criteria | p. 39 |
2.7. The Dynamic WUOFC Algorithm | p. 40 |
3. Results | p. 40 |
3.1. State Recognition and Events Detection | p. 41 |
3.2. Time Series Prediction | p. 44 |
4. Conclusion and Discussion | p. 48 |
Acknowledgments | p. 51 |
References | p. 51 |
Chapter 3 Neural Networks: A Guided Tour | p. 53 |
1. Some Basic Definitions | p. 53 |
2. Supervised Learning | p. 53 |
2.1. Multilayer Perceptrons and Back-Propagation Learning | p. 54 |
2.2. Radial Basis Function (RBF) Networks | p. 57 |
2.3. Support Vector Machines | p. 58 |
3. Unsupervised Learning | p. 59 |
3.1. Principal Components Analysis | p. 59 |
3.2. Self-Organizing Maps | p. 59 |
3.3. Information-Theoretic Models | p. 60 |
4. Neurodynamic Programming | p. 61 |
5. Temporal Processing Using Feed-Forward Networks | p. 62 |
6. Dynamically Driven Recurrent Networks | p. 63 |
7. Concluding Remarks | p. 67 |
References | p. 67 |
Chapter 4 Neural Networks in Processing and Analysis of Biomedical Signals | p. 69 |
1. Overview and History of Artificial Neural Networks | p. 69 |
1.1. What is an Artificial Neural Network? | p. 70 |
1.2. How Did ANNs Come About? | p. 71 |
1.3. Attributes of ANNs | p. 73 |
1.4. Learning in ANNs | p. 74 |
1.4.1. Supervised Learning | p. 74 |
1.4.2. Unsupervised Learning | p. 75 |
1.5. Hardware and Software Implementation of ANNs | p. 76 |
2. Application of ANNs in Processing Information | p. 77 |
2.1. Processing and Analysis of Biomedical Signals | p. 77 |
2.2. Detection and Classification of Biomedical Signals Using ANNs | p. 77 |
2.3. Detection and Classification of Electrocardiography Signals | p. 78 |
2.4. Detection and Classification of Electromyography Signals | p. 81 |
2.5. Detection and Classification of Electroencephalography Signals | p. 83 |
2.6. Detection and Classification of Electrogastrography Signals | p. 85 |
2.7. Detection and Classification of Respiratory Signals | p. 86 |
2.7.1. Detection of Goiter-Induced Upper Airway Obstruction | p. 86 |
2.7.2. Detection of Pharyngeal Wall Vibration During Sleep | p. 88 |
2.8. ANNs in Biomedical Signal Enhancement | p. 89 |
2.9. ANNs in Biomedical Signal Compression | p. 89 |
Additional Reading and Related Material | p. 91 |
Appendix Back-Propagation Optimization Algorithm | p. 92 |
References | p. 95 |
Chapter 5 Rare Event Detection in Genomic Sequences by Neural Networks and Sample Stratification | p. 98 |
1. Introduction | p. 98 |
2. Sample Stratification | p. 98 |
3. Stratifying Coefficients | p. 99 |
3.1. Derivation of a Modified Back-Propagation Algorithm | p. 100 |
3.2. Approximation of A Posteriori Probabilities | p. 102 |
4. Bootstrap Stratification | p. 104 |
4.1. Bootstrap Procedures | p. 104 |
4.2. Bootstrapping of Rare Events | p. 105 |
4.3. Subsampling of Common Events | p. 105 |
4.4. Aggregating of Multiple Neural Networks | p. 105 |
4.5. The Bootstrap Aggregating Rare Event Neural Networks | p. 105 |
5. Data Set Used in the Experiments | p. 106 |
5.1. Genomic Sequence Data | p. 106 |
5.2. Normally Distributed Data 1, 2 | p. 107 |
5.3. Four-Class Synthetic Data | p. 113 |
6. Experimental Results | p. 113 |
6.1. Experiments with Genomic Sequence Data | p. 113 |
6.2. Experiments with Normally Distributed Data 1 | p. 115 |
6.3. Experiments with Normally Distributed Data 2 | p. 118 |
6.4. Experiments with Four-Class Synthetic Data | p. 118 |
7. Conclusions | p. 120 |
References | p. 120 |
Chapter 6 An Axiomatic Approach to Reformulating Radial Basis Neural Networks | p. 122 |
1. Introduction | p. 122 |
2. Function Approximation Models and RBF Neural Networks | p. 125 |
3. Reformulating Radial Basis Neural Networks | p. 127 |
4. Admissible Generator Functions | p. 129 |
4.1. Linear Generator Functions | p. 129 |
4.2. Exponential Generator Functions | p. 132 |
5. Selecting Generator Functions | p. 133 |
5.1. The Blind Spot | p. 134 |
5.2. Criteria for Selecting Generator Functions | p. 136 |
5.3. Evaluation of Linear and Exponential Generator Functions | p. 137 |
5.3.1. Linear Generator Functions | p. 137 |
5.3.2. Exponential Generator Functions | p. 138 |
6. Learning Algorithms Based on Gradient Descent | p. 141 |
6.1. Batch Learning Algorithms | p. 141 |
6.2. Sequential Learning Algorithms | p. 143 |
7. Generator Functions and Gradient Descent Learning | p. 144 |
8. Experimental Results | p. 146 |
9. Conclusions | p. 154 |
References | p. 155 |
Chapter 7 Soft Learning Vector Quantization and Clustering Algorithms Based on Reformulation | p. 158 |
1. Introduction | p. 158 |
2. Clustering Algorithms | p. 159 |
2.1. Crisp and Fuzzy Partitions | p. 160 |
2.2. Crisp c-Means Algorithm | p. 162 |
2.3. Fuzzy c-Means Algorithm | p. 164 |
2.4. Entropy-Constrained Fuzzy Clustering | p. 165 |
3. Reformulating Fuzzy Clustering | p. 168 |
3.1. Reformulating the Fuzzy c-Means Algorithm | p. 168 |
3.2. Reformulating ECFC Algorithms | p. 170 |
4. Generalized Reformulation Function | p. 171 |
4.1. Update Equations | p. 171 |
4.2. Admissible Reformulation Functions | p. 173 |
4.3. Special Cases | p. 173 |
5. Constructing Reformulation Functions: Generator Functions | p. 174 |
6. Constructing Admissible Generator Functions | p. 175 |
6.1. Increasing Generator Functions | p. 176 |
6.2. Decreasing Generator Functions | p. 176 |
6.3. Duality of Increasing and Decreasing Generator Functions | p. 177 |
7. From Generator Functions to LVQ and Clustering Algorithms | p. 178 |
7.1. Competition and Membership Functions | p. 178 |
7.2. Special Cases: Fuzzy LVQ and Clustering Algorithms | p. 180 |
7.2.1. Linear Generator Functions | p. 180 |
7.2.2. Exponential Generator Functions | p. 181 |
8. Soft LVQ and Clustering Algorithms Based on Nonlinear Generator Functions | p. 182 |
8.1. Implementation of the Algorithms | p. 185 |
9. Initialization of Soft LVQ and Clustering Algorithms | p. 186 |
9.1. A Prototype Splitting Procedure | p. 186 |
9.2. Initialization Schemes | p. 187 |
10. Magnetic Resonance Image Segmentation | p. 188 |
11. Conclusions | p. 194 |
Acknowledgments | p. 195 |
References | p. 196 |
Chapter 8 Metastable Associative Network Models of Neuronal Dynamics Transition During Sleep | p. 198 |
1. Dynamics Transition of Neuronal Activities During Sleep | p. 199 |
2. Physiological Substrate of the Global Neuromodulation | p. 201 |
3. Neural Network Model | p. 201 |
4. Spectral Analysis of Neuronal Activities in Neural Network Model | p. 203 |
5. Dynamics of Neural Network in State Space | p. 204 |
6. Metastability of the Network Attractor | p. 206 |
6.1. Escape Time Distributions in Metastable Equilibrium States | p. 206 |
6.2. Potential Walls Surrounding Metastable States | p. 207 |
7. Possible Mechanisms of the Neuronal Dynamics Transition | p. 210 |
8. Discussion | p. 211 |
Acknowledgments | p. 213 |
References | p. 213 |
Chapter 9 Artificial Neural Networks for Spectroscopic Signal Measurement | p. 216 |
1. Introduction | p. 216 |
2. Methods | p. 217 |
2.1. Partial Least Squares | p. 217 |
2.2. Back-Propagation Networks | p. 218 |
2.3. Radial Basis Function Networks | p. 219 |
2.4. Spectral Data Collection and Preprocessing | p. 220 |
3. Results | p. 221 |
3.1. PLS | p. 221 |
3.2. BP | p. 221 |
3.3. RBF | p. 222 |
4. Discussion | p. 222 |
Acknowledgments | p. 231 |
References | p. 231 |
Chapter 10 Applications of Feed-Forward Neural Networks in the Electrogastrogram | p. 233 |
1. Introduction | p. 233 |
2. Measurements and Preprocessing of the EGG | p. 234 |
2.1. Measurements of the EGG | p. 234 |
2.2. Preprocessing of the EGG Data | p. 235 |
2.2.1. ARMA Modeling Parameters | p. 235 |
2.2.2. Running Power Spectra | p. 236 |
2.2.3. Amplitude (Power) Spectrum | p. 238 |
3. Applications in the EGG | p. 239 |
3.1. Detection and Deletion of Motion Artifacts in EGG Recordings | p. 239 |
3.1.1. Input Data to the NN | p. 239 |
3.1.2. Experimental Results | p. 240 |
3.2. Identification of Gastric Contractions from the EGG | p. 241 |
3.2.1. Experimental Data | p. 241 |
3.2.2. Experimental Results | p. 243 |
3.3. Classification of Normal and Abnormal EGGs | p. 244 |
3.3.1. Experimental Data | p. 246 |
3.3.2. Structure of the NN Classifier and Performance Indexes | p. 246 |
3.3.3. Experimental Results | p. 248 |
3.4. Feature-Based Detection of Delayed Gastric Emptying from the EGG | p. 249 |
3.4.1. Experimental Data | p. 250 |
3.4.2. Experimental Results | p. 251 |
4. Discussion and Conclusions | p. 252 |
References | p. 253 |
Index | p. 257 |
About the Editor | p. 259 |