Available:
Library | Item Barcode | Call Number | Material Type | Item Category 1 | Status |
---|---|---|---|---|---|
 | 30000010193005 | TK5102.9 C366 2008 | Open Access Book | Book | On Order |
Summary
New Bayesian approach helps you solve tough problems in signal processing with ease
Signal processing is based on a fundamental concept: the extraction of critical information from noisy, uncertain data. Most techniques rely on underlying Gaussian assumptions for a solution, but what happens when these assumptions are erroneous? Bayesian techniques circumvent this limitation by offering a completely different approach that can easily incorporate non-Gaussian and nonlinear processes along with all of the usual methods currently available.
This text enables readers to fully exploit the many advantages of the "Bayesian approach" to model-based signal processing. It clearly demonstrates the features of this powerful approach compared to the pure statistical methods found in other texts. Readers will discover how easily and effectively the Bayesian approach, coupled with the hierarchy of physics-based models developed throughout, can be applied to signal processing problems that previously seemed unsolvable.
Bayesian Signal Processing features the latest generation of processors (particle filters) that have been enabled by the advent of high-speed/high-throughput computers. The Bayesian approach is uniformly developed in this book's algorithms, examples, applications, and case studies. Throughout this book, the emphasis is on nonlinear/non-Gaussian problems; however, some classical techniques (e.g., Kalman filters, unscented Kalman filters, Gaussian sums, and grid-based filters) are included to enable readers familiar with those methods to draw parallels between the two approaches.
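As a point of reference for the classical techniques mentioned above, the scalar linear Kalman filter reduces to a two-step predict/update recursion. The sketch below is illustrative only; the model parameters (A, H, Q, R) are assumptions chosen for a simple constant-signal example, not values from the book:

```python
import numpy as np

def kalman_step(x, P, z, A=1.0, H=1.0, Q=0.01, R=0.1):
    """One predict/update cycle of a scalar linear Kalman filter.

    x, P : prior state estimate and error variance
    z    : new measurement
    A, H : state-transition and measurement coefficients
    Q, R : process and measurement noise variances
    """
    # Predict: propagate the state estimate and its error variance
    x_pred = A * x
    P_pred = A * P * A + Q
    # Update: correct the prediction with the new measurement z
    K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
    x_new = x_pred + K * (z - H * x_pred)   # innovation-weighted correction
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Track a constant signal observed through noise-free measurements of 1.0
x, P = 0.0, 1.0
for _ in range(5):
    x, P = kalman_step(x, P, 1.0)
```

After a handful of steps the estimate converges toward the measured value while the error variance P shrinks, which is the behavior the book contrasts with particle-based processors in the non-Gaussian case.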
Special features include:
Unified Bayesian treatment starting from the basics (Bayes's rule) to the more advanced (Monte Carlo sampling), evolving to the next-generation techniques (sequential Monte Carlo sampling)
Incorporates "classical" Kalman filtering for linear, linearized, and nonlinear systems; "modern" unscented Kalman filters; and the "next-generation" Bayesian particle filters
Examples illustrate how theory can be applied directly to a variety of processing problems
Case studies demonstrate how the Bayesian approach solves real-world problems in practice
MATLAB notes at the end of each chapter help readers solve complex problems using readily available software commands and point out available software packages
Problem sets test readers' knowledge and help them put their new skills into practice
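The "next-generation" sequential Monte Carlo technique named in the features above is the bootstrap particle filter (covered in Section 7.5.1): propagate particles through the transition prior, weight them by the measurement likelihood, and resample. The scalar dynamics, noise levels, and particle count in this sketch are illustrative assumptions, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(zs, n=500):
    """Bootstrap particle filter for the assumed model
    x_k = 0.5 * x_{k-1} + v_k,  z_k = x_k + w_k  (Gaussian noise)."""
    particles = rng.normal(0.0, 1.0, n)      # draw initial particles from the prior
    estimates = []
    for z in zs:
        # 1. Propagate each particle through the transition prior (dynamics)
        particles = 0.5 * particles + rng.normal(0.0, 0.1, n)
        # 2. Weight by the measurement likelihood p(z | x), sigma = 0.2
        w = np.exp(-0.5 * ((z - particles) / 0.2) ** 2)
        w /= w.sum()
        # 3. Resample (multinomial) to combat weight degeneracy
        particles = rng.choice(particles, size=n, p=w)
        estimates.append(particles.mean())   # posterior-mean state estimate
    return estimates

# Track a geometrically decaying state from its (noise-free) measurements
estimates = bootstrap_pf([1.0, 0.5, 0.25, 0.125])
```

No Gaussian or linearity assumption enters the recursion itself, which is why the same three steps carry over to the nonlinear/non-Gaussian problems the book emphasizes; only the likelihood and dynamics lines would change.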
The basic Bayesian approach is emphasized throughout this text in order to enable readers to rethink how signal processing problems are formulated and solved from the Bayesian perspective. This text brings readers from the classical methods of model-based signal processing to the next generation of processors that will dominate signal processing for years to come. With its many illustrations demonstrating the applicability of the Bayesian approach to real-world problems, this text is essential for all students, scientists, and engineers who investigate and apply signal processing to their everyday problems.
Author Notes
James V. Candy, PhD, is Chief Scientist for Engineering and founder and former director of the Center for Advanced Signal & Image Sciences at the Lawrence Livermore National Laboratory. Dr. Candy is also an Adjunct Full Professor at the University of California, Santa Barbara, a Fellow of the IEEE, and a Fellow of the Acoustical Society of America. He has published more than 225 journal articles, book chapters, and technical reports, and is the author of Signal Processing: The Model-Based Approach, Signal Processing: The Modern Approach, and Model-Based Signal Processing (Wiley). Dr. Candy was awarded the IEEE Distinguished Technical Achievement Award for his development of model-based signal processing and the Acoustical Society of America Helmholtz-Rayleigh Interdisciplinary Silver Medal for his contributions to acoustical signal processing and underwater acoustics.
Table of Contents
Preface | p. xiii |
References to the Preface | p. xix |
Acknowledgments | p. xxiii |
1 Introduction | p. 1 |
1.1 Introduction | p. 1 |
1.2 Bayesian Signal Processing | p. 1 |
1.3 Simulation-Based Approach to Bayesian Processing | p. 4 |
1.4 Bayesian Model-Based Signal Processing | p. 8 |
1.5 Notation and Terminology | p. 12 |
References | p. 14 |
Problems | p. 15 |
2 Bayesian Estimation | p. 19 |
2.1 Introduction | p. 19 |
2.2 Batch Bayesian Estimation | p. 19 |
2.3 Batch Maximum Likelihood Estimation | p. 22 |
2.3.1 Expectation-Maximization Approach to Maximum Likelihood | p. 25 |
2.3.2 EM for Exponential Family of Distributions | p. 30 |
2.4 Batch Minimum Variance Estimation | p. 33 |
2.5 Sequential Bayesian Estimation | p. 36 |
2.5.1 Joint Posterior Estimation | p. 39 |
2.5.2 Filtering Posterior Estimation | p. 41 |
2.6 Summary | p. 43 |
References | p. 44 |
Problems | p. 45 |
3 Simulation-Based Bayesian Methods | p. 51 |
3.1 Introduction | p. 51 |
3.2 Probability Density Function Estimation | p. 53 |
3.3 Sampling Theory | p. 56 |
3.3.1 Uniform Sampling Method | p. 58 |
3.3.2 Rejection Sampling Method | p. 62 |
3.4 Monte Carlo Approach | p. 64 |
3.4.1 Markov Chains | p. 70 |
3.4.2 Metropolis-Hastings Sampling | p. 71 |
3.4.3 Random Walk Metropolis-Hastings Sampling | p. 73 |
3.4.4 Gibbs Sampling | p. 75 |
3.4.5 Slice Sampling | p. 78 |
3.5 Importance Sampling | p. 81 |
3.6 Sequential Importance Sampling | p. 84 |
3.7 Summary | p. 87 |
References | p. 87 |
Problems | p. 90 |
4 State-Space Models for Bayesian Processing | p. 95 |
4.1 Introduction | p. 95 |
4.2 Continuous-Time State-Space Models | p. 96 |
4.3 Sampled-Data State-Space Models | p. 100 |
4.4 Discrete-Time State-Space Models | p. 104 |
4.4.1 Discrete Systems Theory | p. 107 |
4.5 Gauss-Markov State-Space Models | p. 112 |
4.5.1 Continuous-Time/Sampled-Data Gauss-Markov Models | p. 112 |
4.5.2 Discrete-Time Gauss-Markov Models | p. 114 |
4.6 Innovations Model | p. 120 |
4.7 State-Space Model Structures | p. 121 |
4.7.1 Time Series Models | p. 121 |
4.7.2 State-Space and Time Series Equivalence Models | p. 129 |
4.8 Nonlinear (Approximate) Gauss-Markov State-Space Models | p. 135 |
4.9 Summary | p. 139 |
References | p. 140 |
Problems | p. 141 |
5 Classical Bayesian State-Space Processors | p. 147 |
5.1 Introduction | p. 147 |
5.2 Bayesian Approach to the State-Space | p. 147 |
5.3 Linear Bayesian Processor (Linear Kalman Filter) | p. 150 |
5.4 Linearized Bayesian Processor (Linearized Kalman Filter) | p. 160 |
5.5 Extended Bayesian Processor (Extended Kalman Filter) | p. 167 |
5.6 Iterated-Extended Bayesian Processor (Iterated-Extended Kalman Filter) | p. 174 |
5.7 Practical Aspects of Classical Bayesian Processors | p. 182 |
5.8 Case Study: RLC Circuit Problem | p. 186 |
5.9 Summary | p. 191 |
References | p. 191 |
Problems | p. 193 |
6 Modern Bayesian State-Space Processors | p. 197 |
6.1 Introduction | p. 197 |
6.2 Sigma-Point (Unscented) Transformations | p. 198 |
6.2.1 Statistical Linearization | p. 198 |
6.2.2 Sigma-Point Approach | p. 200 |
6.2.3 SPT for Gaussian Prior Distributions | p. 205 |
6.3 Sigma-Point Bayesian Processor (Unscented Kalman Filter) | p. 209 |
6.3.1 Extensions of the Sigma-Point Processor | p. 218 |
6.4 Quadrature Bayesian Processors | p. 218 |
6.5 Gaussian Sum (Mixture) Bayesian Processors | p. 220 |
6.6 Case Study: 2D-Tracking Problem | p. 224 |
6.7 Summary | p. 230 |
References | p. 231 |
Problems | p. 233 |
7 Particle-Based Bayesian State-Space Processors | p. 237 |
7.1 Introduction | p. 237 |
7.2 Bayesian State-Space Particle Filters | p. 237 |
7.3 Importance Proposal Distributions | p. 242 |
7.3.1 Minimum Variance Importance Distribution | p. 242 |
7.3.2 Transition Prior Importance Distribution | p. 245 |
7.4 Resampling | p. 246 |
7.4.1 Multinomial Resampling | p. 249 |
7.4.2 Systematic Resampling | p. 251 |
7.4.3 Residual Resampling | p. 251 |
7.5 State-Space Particle Filtering Techniques | p. 252 |
7.5.1 Bootstrap Particle Filter | p. 253 |
7.5.2 Auxiliary Particle Filter | p. 261 |
7.5.3 Regularized Particle Filter | p. 264 |
7.5.4 MCMC Particle Filter | p. 266 |
7.5.5 Linearized Particle Filter | p. 270 |
7.6 Practical Aspects of Particle Filter Design | p. 272 |
7.6.1 Posterior Probability Validation | p. 273 |
7.6.2 Model Validation Testing | p. 277 |
7.7 Case Study: Population Growth Problem | p. 285 |
7.8 Summary | p. 289 |
References | p. 290 |
Problems | p. 293 |
8 Joint Bayesian State/Parametric Processors | p. 299 |
8.1 Introduction | p. 299 |
8.2 Bayesian Approach to Joint State/Parameter Estimation | p. 300 |
8.3 Classical/Modern Joint Bayesian State/Parametric Processors | p. 302 |
8.3.1 Classical Joint Bayesian Processor | p. 303 |
8.3.2 Modern Joint Bayesian Processor | p. 311 |
8.4 Particle-Based Joint Bayesian State/Parametric Processors | p. 313 |
8.5 Case Study: Random Target Tracking Using a Synthetic Aperture Towed Array | p. 318 |
8.6 Summary | p. 327 |
References | p. 328 |
Problems | p. 330 |
9 Discrete Hidden Markov Model Bayesian Processors | p. 335 |
9.1 Introduction | p. 335 |
9.2 Hidden Markov Models | p. 335 |
9.2.1 Discrete-Time Markov Chains | p. 336 |
9.2.2 Hidden Markov Chains | p. 337 |
9.3 Properties of the Hidden Markov Model | p. 339 |
9.4 HMM Observation Probability: Evaluation Problem | p. 341 |
9.5 State Estimation in HMM: The Viterbi Technique | p. 345 |
9.5.1 Individual Hidden State Estimation | p. 345 |
9.5.2 Entire Hidden State Sequence Estimation | p. 347 |
9.6 Parameter Estimation in HMM: The EM/Baum-Welch Technique | p. 350 |
9.6.1 Parameter Estimation with State Sequence Known | p. 352 |
9.6.2 Parameter Estimation with State Sequence Unknown | p. 354 |
9.7 Case Study: Time-Reversal Decoding | p. 357 |
9.8 Summary | p. 362 |
References | p. 363 |
Problems | p. 365 |
10 Bayesian Processors for Physics-Based Applications | p. 369 |
10.1 Optimal Position Estimation for the Automatic Alignment | p. 369 |
10.1.1 Background | p. 369 |
10.1.2 Stochastic Modeling of Position Measurements | p. 372 |
10.1.3 Bayesian Position Estimation and Detection | p. 374 |
10.1.4 Application: Beam Line Data | p. 375 |
10.1.5 Results: Beam Line (KDP Deviation) Data | p. 377 |
10.1.6 Results: Anomaly Detection | p. 379 |
10.2 Broadband Ocean Acoustic Processing | p. 382 |
10.2.1 Background | p. 382 |
10.2.2 Broadband State-Space Ocean Acoustic Propagators | p. 384 |
10.2.3 Broadband Bayesian Processing | p. 389 |
10.2.4 Broadband BSP Design | p. 393 |
10.2.5 Results | p. 395 |
10.3 Bayesian Processing for Biothreats | p. 397 |
10.3.1 Background | p. 397 |
10.3.2 Parameter Estimation | p. 400 |
10.3.3 Bayesian Processor Design | p. 401 |
10.3.4 Results | p. 403 |
10.4 Bayesian Processing for the Detection of Radioactive Sources | p. 404 |
10.4.1 Background | p. 404 |
10.4.2 Physics-Based Models | p. 404 |
10.4.3 Gamma-Ray Detector Measurements | p. 407 |
10.4.4 Bayesian Physics-Based Processor | p. 410 |
10.4.5 Physics-Based Bayesian Deconvolution Processor | p. 412 |
10.4.6 Results | p. 415 |
References | p. 417 |
Appendix A Probability & Statistics Overview | p. 423 |
A.1 Probability Theory | p. 423 |
A.2 Gaussian Random Vectors | p. 429 |
A.3 Uncorrelated Transformation: Gaussian Random Vectors | p. 430 |
References | p. 430 |
Index | p. 431 |