Title:
Random signals and noise : a mathematical introduction
Personal Author:
Engelberg, Shlomo
Publication Information:
Boca Raton : CRC Press, Taylor & Francis Group, 2007
Physical Description:
vii, 216 p. : ill. ; 24 cm.
ISBN:
9780849375545

Available:

Item Barcode      Call Number          Material Type      Item Category
30000010210071    TK5102.5 E55 2007    Open Access Book   Book
30000010117128    TK5102.5 E55 2007    Book               Book
30000010117129    TK5102.5 E55 2007    Open Access Book   Book

Summary

Understanding the nature of random signals and noise is critically important for detecting signals and for minimizing the effects of noise in applications such as communications and control systems. Outlining a variety of techniques and explaining when and how to use them, Random Signals and Noise: A Mathematical Introduction focuses on applications and practical problem solving rather than on probability theory.

A Firm Foundation
Before launching into the particulars of random signals and noise, the author outlines the elements of probability that are used throughout the book and includes an appendix on the relevant aspects of linear algebra. He offers a careful treatment of Lagrange multipliers and the Fourier transform, as well as the basics of stochastic processes, estimation, matched filtering, the Wiener-Khinchin theorem and its applications, the Schottky and Nyquist formulas, and physical sources of noise.

Practical Tools for Modern Problems
Along with these traditional topics, the book includes a chapter devoted to spread spectrum techniques. It also demonstrates the use of MATLAB® for solving complicated problems in a short amount of time while still building a sound knowledge of the underlying principles.

A self-contained primer for solving real problems, Random Signals and Noise presents a complete set of tools and offers guidance on their effective application.


Table of Contents

Preface, p. xvii
1 Elementary Probability Theory, p. 1
1.1 The Probability Function, p. 1
1.2 A Bit of Philosophy, p. 1
1.3 The One-Dimensional Random Variable, p. 2
1.4 The Discrete Random Variable and the PMF, p. 3
1.5 A Bit of Combinatorics, p. 4
1.5.1 An Introductory Example, p. 4
1.5.2 A More Systematic Approach, p. 5
1.5.3 How Many Ways Can N Distinct Items Be Ordered?, p. 6
1.5.4 How Many Distinct Subsets of N Elements Are There?, p. 6
1.5.5 The Binomial Formula, p. 7
1.6 The Binomial Distribution, p. 7
1.7 The Continuous Random Variable, the CDF, and the PDF, p. 9
1.8 The Expected Value, p. 12
1.9 Two Dimensional Random Variables, p. 17
1.9.1 The Discrete Random Variable and the PMF, p. 18
1.9.2 The CDF and the PDF, p. 19
1.9.3 The Expected Value, p. 20
1.9.4 Correlation, p. 21
1.9.5 The Correlation Coefficient, p. 21
1.10 The Characteristic Function, p. 22
1.11 Gaussian Random Variables, p. 24
1.12 Exercises, p. 26
2 An Introduction to Stochastic Processes, p. 31
2.1 What Is a Stochastic Process?, p. 31
2.2 The Autocorrelation Function, p. 33
2.3 What Does the Autocorrelation Function Tell Us?, p. 33
2.4 The Evenness of the Autocorrelation Function, p. 34
2.5 Two Proofs that R_XX(0) ≥ |R_XX(τ)|, p. 34
2.6 Some Examples, p. 36
2.7 Exercises, p. 38
3 The Weak Law of Large Numbers, p. 41
3.1 The Markov Inequality, p. 41
3.2 Chebyshev's Inequality, p. 42
3.3 A Simple Example, p. 43
3.4 The Weak Law of Large Numbers, p. 45
3.5 Correlated Random Variables, p. 47
3.6 Detecting a Constant Signal in the Presence of Additive Noise, p. 49
3.7 A Method for Determining the CDF of a Random Variable, p. 50
3.8 Exercises, p. 51
4 The Central Limit Theorem, p. 55
4.1 Introduction, p. 55
4.2 The Proof of the Central Limit Theorem, p. 56
4.3 Detecting a Constant Signal in the Presence of Additive Noise, p. 59
4.4 Detecting a (Particular) Non-Constant Signal in the Presence of Additive Noise, p. 61
4.5 The Monte Carlo Method, p. 63
4.6 Poisson Convergence, p. 64
4.7 Exercises, p. 68
5 Extrema and the Method of Lagrange Multipliers, p. 73
5.1 The Directional Derivative and the Gradient, p. 73
5.2 Over-Determined Systems, p. 74
5.2.1 General Theory, p. 74
5.2.2 Recovering a Constant from Noisy Samples, p. 75
5.2.3 Recovering a Line from Noisy Samples, p. 76
5.3 The Method of Lagrange Multipliers, p. 77
5.3.1 Statement of the Result, p. 77
5.3.2 A Preliminary Result, p. 78
5.3.3 Proof of the Method, p. 80
5.4 The Cauchy-Schwarz Inequality, p. 83
5.5 Under-Determined Systems, p. 84
5.6 Exercises, p. 86
6 The Matched Filter for Stationary Noise, p. 89
6.1 White Noise, p. 89
6.2 Colored Noise, p. 91
6.3 The Autocorrelation Matrix, p. 96
6.4 The Effect of Sampling Many Times in a Fixed Interval, p. 97
6.5 More about the Signal to Noise Ratio, p. 98
6.6 Choosing the Optimal Signal for a Given Noise Type, p. 100
6.7 Exercises, p. 101
7 Fourier Series and Transforms, p. 105
7.1 The Fourier Series, p. 105
7.2 The Functions e_n(t) Span - A Plausibility Argument, p. 108
7.3 The Fourier Transform, p. 111
7.4 Some Properties of the Fourier Transform, p. 112
7.5 Some Fourier Transforms, p. 115
7.6 A Connection between the Time and Frequency Domains, p. 119
7.7 Preservation of the Inner Product, p. 120
7.8 Exercises, p. 121
8 The Wiener-Khinchin Theorem and Applications, p. 125
8.1 The Periodic Case, p. 125
8.2 The Aperiodic Case, p. 128
8.3 The Effect of Filtering, p. 129
8.4 The Significance of the Power Spectral Density, p. 130
8.5 White Noise, p. 131
8.6 Low-Pass Noise, p. 131
8.7 Low-Pass Filtered Low-Pass Noise, p. 132
8.8 The Schottky Formula for Shot Noise, p. 133
8.9 A Semi-Practical Example, p. 135
8.10 Johnson Noise and the Nyquist Formula, p. 138
8.11 Why Use RMS Measurements, p. 140
8.12 The Practical Resistor as a Circuit Element, p. 141
8.13 The Random Telegraph Signal - Another Low-Pass Signal, p. 143
8.14 Exercises, p. 144
9 Spread Spectrum, p. 149
9.1 Introduction, p. 149
9.2 The Probabilistic Approach, p. 150
9.3 A Spread Spectrum Signal with Narrow Band Noise, p. 151
9.4 The Effect of Multiple Transmitters, p. 153
9.5 Spread Spectrum - The Deterministic Approach, p. 155
9.6 Finite State Machines, p. 156
9.7 Modulo Two Recurrence Relations, p. 157
9.8 A Simple Example, p. 158
9.9 Maximal Length Sequences, p. 158
9.10 Determining the Period, p. 160
9.11 An Example, p. 161
9.12 Some Conditions for Maximality, p. 162
9.13 What We Have Not Discussed, p. 163
9.14 Exercises, p. 163
10 More about the Autocorrelation and the PSD, p. 165
10.1 The "Positivity" of the Autocorrelation, p. 165
10.2 Another Proof that R_XX(0) ≥ |R_XX(τ)|, p. 166
10.3 Estimating the PSD, p. 166
10.4 The Properties of the Periodogram, p. 168
10.5 Exercises, p. 169
11 Wiener Filters, p. 171
11.1 A Non-Causal Solution, p. 171
11.2 White Noise and a Low-Pass Signal, p. 174
11.3 Causality, Anti-Causality and the Fourier Transform, p. 175
11.4 The Optimal Causal Filter, p. 177
11.5 Two Examples, p. 179
11.5.1 White Noise and a Low-Pass Signal, p. 179
11.5.2 Low-Pass Signal and Noise, p. 180
11.6 Exercises, p. 181
A A Brief Overview of Linear Algebra, p. 185
A.1 The Space C^N, p. 185
A.2 Linear Independence and Bases, p. 186
A.3 A Preliminary Result, p. 187
A.4 The Dimension of C^N, p. 188
A.5 Linear Mappings, p. 189
A.6 Matrices, p. 190
A.7 Sums of Mappings and Sums of Matrices, p. 191
A.8 The Composition of Linear Mappings - Matrix Multiplication, p. 192
A.9 A Very Special Matrix, p. 193
A.10 Solving Simultaneous Linear Equations, p. 193
A.11 The Inverse of a Linear Mapping, p. 196
A.12 Invertibility, p. 197
A.13 The Determinant - A Test for Invertibility, p. 199
A.14 Eigenvectors and Eigenvalues, p. 200
A.15 The Inner Product, p. 202
A.16 A Simple Proof of the Cauchy-Schwarz Inequality, p. 203
A.17 The Hermitian Transpose of a Matrix, p. 204
A.18 Some Important Properties of Self-Adjoint Matrices, p. 205
A.19 Exercises, p. 206
Bibliography, p. 209
Index, p. 212