Title:
Complex valued nonlinear adaptive filters : noncircularity, widely linear and neural models
Personal Author:
Mandic, Danilo P.
Series:
Adaptive and learning systems for signal processing, communications, and control
Publication Information:
Chichester, UK : Wiley, 2009
Physical Description:
xviii, 324 p. : ill. ; 25 cm.
ISBN:
9780470066355
Added Author:
Goh, Su-Lee

Available:

Item Barcode: 30000010202971
Call Number: TA347.C64 M36 2009
Material Type: Open Access Book
Item Category 1: Book

Summary

This book was written in response to the growing demand for a text that provides a unified treatment of linear and nonlinear complex valued adaptive filters, and of methods for the processing of general complex signals (circular and noncircular). It brings together adaptive filtering algorithms for feedforward (transversal) and feedback architectures and recent developments in the statistics of complex variables, under the powerful frameworks of CR (Wirtinger) calculus and augmented complex statistics. This approach offers a number of theoretical performance gains, which are illustrated for both stochastic gradient algorithms, such as the augmented complex least mean square (ACLMS), and algorithms based on Kalman filters. The work is supported by a number of simulations using synthetic and real-world data, including noncircular and intermittent radar and wind signals.
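
The widely linear modelling and the ACLMS algorithm mentioned in the summary can be illustrated with a short sketch. Below is a minimal, non-authoritative Python/NumPy example of a widely linear (augmented) LMS filter in the spirit of ACLMS, assuming the common convention y(k) = h^T x(k) + g^T x*(k); the function name aclms_filter, the step size mu and the filter order are illustrative choices, not taken from the book.

    import numpy as np

    def aclms_filter(x, d, mu=0.01, order=4):
        """Sketch of an augmented (widely linear) complex LMS filter.
        Estimates d(k) from the last `order` samples of x and their complex
        conjugates, updating both weight vectors with a stochastic gradient
        derived via CR (Wirtinger) calculus."""
        n = len(x)
        h = np.zeros(order, dtype=complex)   # weights for x (standard, linear part)
        g = np.zeros(order, dtype=complex)   # weights for conj(x) (augmented part)
        y = np.zeros(n, dtype=complex)       # filter output
        e = np.zeros(n, dtype=complex)       # estimation error
        for k in range(order, n):
            xk = x[k - order:k][::-1]        # most recent samples first
            y[k] = h @ xk + g @ np.conj(xk)  # widely linear estimate
            e[k] = d[k] - y[k]
            h += mu * e[k] * np.conj(xk)     # CLMS-style update for the linear part
            g += mu * e[k] * xk              # update for the conjugate part
        return y, e

    # Illustrative use: one-step prediction of a noncircular AR(1) signal
    rng = np.random.default_rng(0)
    w = rng.standard_normal(2000) + 0.3j * rng.standard_normal(2000)  # improper noise
    s = np.zeros(2000, dtype=complex)
    for k in range(1, 2000):
        s[k] = 0.9 * s[k - 1] + w[k]
    y, e = aclms_filter(s, s, mu=0.01, order=4)   # predict s[k] from its past samples
    print(np.mean(np.abs(e[1000:]) ** 2))         # steady-state prediction error power

The conjugate weight vector g is what lets such a filter exploit noncircularity (improperness); for circular inputs its contribution vanishes and the filter effectively reduces to the standard complex LMS.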


Author Notes

Danilo Mandic, Department of Electrical and Electronic Engineering, Imperial College London, London
Dr Mandic is currently a Reader in Signal Processing at Imperial College London. He is an experienced author, having written the book Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability (Wiley, 2001) and more than 150 journal and conference papers on signal and image processing. His research interests include nonlinear adaptive signal processing, multimodal signal processing and nonlinear dynamics. He is an Associate Editor for the IEEE Transactions on Circuits and Systems and the International Journal of Mathematical Modelling and Algorithms, serves on the IEEE Technical Committee on Machine Learning for Signal Processing, and has produced award-winning papers and products resulting from his collaboration with industry.

Su-Lee Goh, Royal Dutch Shell plc, Holland
Dr Goh is currently working as a Reservoir Imaging Geophysicist at Shell in Holland. Her research interests include nonlinear signal processing, adaptive filters, complex-valued analysis, and imaging and forecasting. She received her PhD in nonlinear adaptive signal processing from Imperial College London, and is a member of the IEEE and the Society of Exploration Geophysicists.


Table of Contents

Series Editor's Foreword
About the Authors
Preface
Acknowledgements
1 The Magic of Complex Numbers
1.1 History of Complex Numbers
1.2 History of Mathematical Notation
1.3 Development of Complex Valued Adaptive Signal Processing
2 Why Signal Processing in the Complex Domain?
2.1 Some Examples of Complex Valued Signal Processing
2.2 Modelling in C is Not Only Convenient But Also Natural
2.3 Why Complex Modelling of Real Valued Processes?
2.4 Exploiting the Phase Information
2.5 Other Applications of Complex Domain Processing of Real Valued Signals
2.6 Additional Benefits of Complex Domain Processing
3 Adaptive Filtering Architectures
3.1 Linear and Nonlinear Stochastic Models
3.2 Linear and Nonlinear Adaptive Filtering Architectures
3.3 State Space Representation and Canonical Forms
4 Complex Nonlinear Activation Functions
4.1 Properties of Complex Functions
4.2 Universal Function Approximation
4.3 Nonlinear Activation Functions for Complex Neural Networks
4.4 Generalised Splitting Activation Functions (GSAF)
4.5 Summary: Choice of the Complex Activation Function
5 Elements of CR Calculus
5.1 Continuous Complex Functions
5.2 The Cauchy-Riemann Equations
5.3 Generalised Derivatives of Functions of Complex Variable
5.4 CR-derivatives of Cost Functions
6 Complex Valued Adaptive Filters
6.1 Adaptive Filtering Configurations
6.2 The Complex Least Mean Square Algorithm
6.3 Nonlinear Feedforward Complex Adaptive Filters
6.4 Normalisation of Learning Algorithms
6.5 Performance of Feedforward Nonlinear Adaptive Filters
6.6 Summary: Choice of a Nonlinear Adaptive Filter
7 Adaptive Filters with Feedback
7.1 Training of IIR Adaptive Filters
7.2 Nonlinear Adaptive IIR Filters: Recurrent Perceptron
7.3 Training of Recurrent Neural Networks
7.4 Simulation Examples
8 Filters with an Adaptive Stepsize
8.1 Benveniste Type Variable Stepsize Algorithms
8.2 Complex Valued GNGD Algorithms
8.3 Simulation Examples
9 Filters with an Adaptive Amplitude of Nonlinearity
9.1 Dynamical Range Reduction
9.2 FIR Adaptive Filters with an Adaptive Nonlinearity
9.3 Recurrent Neural Networks with Trainable Amplitude of Activation Functions
9.4 Simulation Results
10 Data-reusing Algorithms for Complex Valued Adaptive Filters
10.1 The Data-reusing Complex Valued Least Mean Square (DRCLMS) Algorithm
10.2 Data-reusing Complex Nonlinear Adaptive Filters
10.3 Data-reusing Algorithms for Complex RNNs
11 Complex Mappings and Möbius Transformations
11.1 Matrix Representation of a Complex Number
11.2 The Möbius Transformation
11.3 Activation Functions and Möbius Transformations
11.4 All-pass Systems as Möbius Transformations
11.5 Fractional Delay Filters
12 Augmented Complex Statistics
12.1 Complex Random Variables (CRV)
12.2 Complex Circular Random Variables
12.3 Complex Signals
12.4 Second-order Characterisation of Complex Signals
13 Widely Linear Estimation and Augmented CLMS (ACLMS)
13.1 Minimum Mean Square Error (MMSE) Estimation in C
13.2 Complex White Noise
13.3 Autoregressive Modelling in C
13.4 The Augmented Complex LMS (ACLMS) Algorithm
13.5 Adaptive Prediction Based on ACLMS
14 Duality Between Complex Valued and Real Valued Filters
14.1 A Dual Channel Real Valued Adaptive Filter
14.2 Duality Between Real and Complex Valued Filters
14.3 Simulations
15 Widely Linear Filters with Feedback
15.1 The Widely Linear ARMA (WL-ARMA) Model
15.2 Widely Linear Adaptive Filters with Feedback
15.3 The Augmented Complex Valued RTRL (ACRTRL) Algorithm
15.4 The Augmented Kalman Filter Algorithm for RNNs
15.5 Augmented Complex Unscented Kalman Filter (ACUKF)
15.6 Simulation Examples
16 Collaborative Adaptive Filtering
16.1 Parametric Signal Modality Characterisation
16.2 Standard Hybrid Filtering in R
16.3 Tracking the Linear/Nonlinear Nature of Complex Valued Signals
16.4 Split vs Fully Complex Signal Natures
16.5 Online Assessment of the Nature of Wind Signal
16.6 Collaborative Filters for General Complex Signals
17 Adaptive Filtering Based on EMD
17.1 The Empirical Mode Decomposition Algorithm
17.2 Complex Extensions of Empirical Mode Decomposition
17.3 Addressing the Problem of Uniqueness
17.4 Applications of Complex Extensions of EMD
18 Validation of Complex Representations - Is This Worthwhile?
18.1 Signal Modality Characterisation in R
18.2 Testing for the Validity of Complex Representation
18.3 Quantifying Benefits of Complex Valued Representation
Appendix A Some Distinctive Properties of Calculus in C
Appendix B Proof of Liouville's Theorem
Appendix C Hypercomplex and Clifford Algebras
C.1 Definitions of Algebraic Notions of Group, Ring and Field
C.2 Definition of a Vector Space
C.3 Higher Dimension Algebras
C.4 The Algebra of Quaternions
C.5 Clifford Algebras
Appendix D Real Valued Activation Functions
D.1 Logistic Sigmoid Activation Function
D.2 Hyperbolic Tangent Activation Function
Appendix E Elementary Transcendental Functions (ETF)
Appendix F The O Notation and Standard Vector and Matrix Differentiation
F.1 The O Notation
F.2 Standard Vector and Matrix Differentiation
Appendix G Notions From Learning Theory
G.1 Types of Learning
G.2 The Bias-Variance Dilemma
G.3 Recursive and Iterative Gradient Estimation Techniques
G.4 Transformation of Input Data
Appendix H Notions from Approximation Theory
Appendix I Terminology Used in the Field of Neural Networks
Appendix J Complex Valued Pipelined Recurrent Neural Network (CPRNN)
J.1 The Complex RTRL Algorithm (CRTRL) for CPRNN
J.1.1 Linear Subsection Within the PRNN
Appendix K GASS Algorithms in R
K.1 Gradient Adaptive Stepsize Algorithms Based on ∂E/∂µ
K.2 Variable Stepsize Algorithms Based on ∂E/∂ε
Appendix L Derivation of Partial Derivatives from Chapter 8
L.1 Derivation of ∂e(k)/∂wn(k)
L.2 Derivation of ∂e∗(k)/∂ε(k − 1)
L.3 Derivation of ∂w(k)/∂ε(k − 1)
Appendix M A Posteriori Learning
M.1 A Posteriori Strategies in Adaptive Learning
Appendix N Notions from Stability Theory
Appendix O Linear Relaxation
O.1 Vector and Matrix Norms
O.2 Relaxation in Linear Systems
O.2.1 Convergence in the Norm or State Space?
Appendix P Contraction Mappings, Fixed Point Iteration and Fractals
P.1 Historical Perspective
P.2 More on Convergence: Modified Contraction Mapping
P.3 Fractals and Mandelbrot Set
References
Index