Title:
Control theory in physics and other fields of science : concepts, tools, and applications
Series:
Springer tracts in modern physics ; 215
Publication Information:
Berlin : Springer, 2006
ISBN:
9783540295143

Available:*

Item Barcode     Call Number     Material Type       Item Category 1
30000010106900   QC1 S33 2006    Open Access Book    Book
30000010113175   QC1 S33 2006    Open Access Book    Book


Summary

This book provides an introduction to the analysis and control of physical, chemical, biological, technological, and economic models and their nonequilibrium evolution dynamics. Strong emphasis is placed on the foundations of variational principles, evolution and control equations, numerical methods, and statistical concepts and techniques for solving or estimating stochastic control problems in systems with a high degree of complexity. The central aim of the book is to develop a synergetic connection between theoretical concepts and real applications. It is a modern introduction and a helpful tool for researchers as well as for graduate students interested in econophysics and related topics.


Table of Contents

1 Introduction p. 1
1.1 The Aim of Control Theory p. 1
1.2 Dynamic State of Classical Mechanical Systems p. 3
1.3 Dynamic State of Complex Systems p. 6
1.3.1 What Is a Complex System? p. 6
1.3.2 Relevant and Irrelevant Degrees of Freedom p. 9
1.3.3 Quasi-Deterministic Versus Quasi-Stochastic Evolution p. 10
1.4 The Physical Approach to Control Theory p. 13
References p. 14
2 Deterministic Control Theory p. 17
2.1 Introduction: The Brachistochrone Problem p. 17
2.2 The Deterministic Control Problem p. 19
2.2.1 Functionals, Constraints, and Boundary Conditions p. 19
2.2.2 Weak and Strong Minima p. 20
2.3 The Simplest Control Problem: Classical Mechanics p. 22
2.3.1 Euler-Lagrange Equations p. 22
2.3.2 Optimum Criterion p. 24
2.3.3 One-Dimensional Systems p. 30
2.4 General Optimum Control Problem p. 33
2.4.1 Lagrange Approach p. 33
2.4.2 Hamilton Approach p. 40
2.4.3 Pontryagin's Maximum Principle p. 42
2.4.4 Applications of the Maximum Principle p. 45
2.4.5 Controlled Molecular Dynamic Simulations p. 53
2.5 The Hamilton-Jacobi Equation p. 55
References p. 59
3 Linear Quadratic Problems p. 61
3.1 Introduction to Linear Quadratic Problems p. 61
3.1.1 Motivation p. 61
3.1.2 The Performance Functional p. 62
3.1.3 Stability Analysis p. 63
3.1.4 The General Solution of Linear Quadratic Problems p. 71
3.2 Extensions and Applications p. 73
3.2.1 Modifications of the Performance p. 73
3.2.2 Inhomogeneous Linear Evolution Equations p. 75
3.2.3 Scalar Problems p. 75
3.3 The Optimal Regulator p. 77
3.3.1 Algebraic Riccati Equation p. 77
3.3.2 Stability of Optimal Regulators p. 79
3.4 Control of Linear Oscillations and Relaxations p. 81
3.4.1 Integral Representation of State Dynamics p. 81
3.4.2 Optimal Control of Generalized Linear Evolution Equations p. 85
3.4.3 Perturbation Theory for Weakly Nonlinear Dynamics p. 88
References p. 90
4 Control of Fields p. 93
4.1 Field Equations p. 93
4.1.1 Classical Field Theory p. 93
4.1.2 Hydrodynamic Field Equations p. 99
4.1.3 Other Field Equations p. 101
4.2 Control by External Sources p. 103
4.2.1 General Aspects p. 103
4.2.2 Control Without Spatial Boundaries p. 104
4.2.3 Passive Boundary Conditions p. 114
4.3 Control via Boundary Conditions p. 116
References p. 118
5 Chaos Control p. 123
5.1 Characterization of Trajectories in the Phase Space p. 123
5.1.1 General Problems p. 123
5.1.2 Conservative Hamiltonian Systems p. 124
5.1.3 Nonconservative Systems p. 126
5.2 Time-Discrete Chaos Control p. 128
5.2.1 Time Continuous Control Versus Time Discrete Control p. 128
5.2.2 Chaotic Behavior of Time Discrete Systems p. 132
5.2.3 Control of Time Discrete Equations p. 135
5.2.4 Reachability and Stabilizability p. 137
5.2.5 Observability p. 140
5.3 Time-Continuous Chaos Control p. 141
5.3.1 Delayed Feedback Control p. 141
5.3.2 Synchronization p. 144
References p. 146
6 Nonequilibrium Statistical Physics p. 149
6.1 Statistical Approach to Phase Space Dynamics p. 149
6.1.1 The Probability Distribution p. 149
6.2 The Liouville Equation p. 152
6.3 Generalized Rate Equations p. 153
6.3.1 Probability Distribution of Relevant Quantities p. 153
6.3.2 The Formal Solution of the Liouville Equation p. 155
6.3.3 The Nakajima-Zwanzig Equation p. 156
6.4 Notation of Probability Theory p. 161
6.4.1 Measures of Central Tendency p. 161
6.4.2 Measures of Fluctuations around the Central Tendency p. 162
6.4.3 Moments and Characteristic Functions p. 162
6.4.4 Cumulants p. 163
6.5 Combined Probabilities p. 164
6.5.1 Conditional Probability p. 164
6.5.2 Joint Probability p. 165
6.6 Markov Approximation p. 167
6.7 Generalized Fokker-Planck Equation p. 169
6.7.1 Differential Chapman-Kolmogorov Equation p. 169
6.7.2 Deterministic Processes p. 173
6.7.3 Markov Diffusion Processes p. 174
6.7.4 Jump Processes p. 175
6.8 Correlation and Stationarity p. 176
6.8.1 Stationarity p. 176
6.8.2 Correlation p. 177
6.8.3 Spectra p. 178
6.9 Stochastic Equations of Motion p. 179
6.9.1 The Mori-Zwanzig Equation p. 179
6.9.2 Separation of Time Scales p. 182
6.9.3 Wiener Process p. 183
6.9.4 Stochastic Differential Equations p. 185
6.9.5 Ito's Formula and Fokker-Planck Equation p. 189
References p. 191
7 Optimal Control of Stochastic Processes p. 193
7.1 Markov Diffusion Processes under Control p. 193
7.1.1 Information Level and Control Mechanisms p. 193
7.1.2 Path Integrals p. 194
7.1.3 Performance p. 197
7.2 Optimal Open Loop Control p. 199
7.2.1 Mean Performance p. 199
7.2.2 Tree Approximation p. 201
7.3 Feedback Control p. 204
7.3.1 The Control Equation p. 204
7.3.2 Linear Quadratic Problems p. 210
References p. 211
8 Filters and Predictors p. 213
8.1 Partial Uncertainty of Controlled Systems p. 213
8.2 Gaussian Processes p. 215
8.2.1 The Central Limit Theorem p. 215
8.2.2 Convergence Problems p. 220
8.3 Lévy Processes p. 223
8.3.1 Form-Stable Limit Distributions p. 223
8.3.2 Convergence to Stable Lévy Distributions p. 226
8.3.3 Truncated Lévy Distributions p. 227
8.4 Rare Events p. 228
8.4.1 The Cramér Theorem p. 228
8.4.2 Extreme Fluctuations p. 230
8.5 Kalman Filter p. 232
8.5.1 Linear Quadratic Problems with Gaussian Noise p. 232
8.5.2 Estimation of the System State p. 232
8.5.3 Lyapunov Differential Equation p. 237
8.5.4 Optimal Control Problem for Kalman Filters p. 239
8.6 Filters and Predictors p. 243
8.6.1 General Filter Concepts p. 243
8.6.2 Wiener Filters p. 244
8.6.3 Estimation of the System Dynamics p. 245
8.6.4 Regression and Autoregression p. 246
8.6.5 The Bayesian Concept p. 249
8.6.6 Neural Networks p. 251
References p. 261
9 Game Theory p. 265
9.1 Unpredictable Systems p. 265
9.2 Optimal Control and Decision Theory p. 267
9.2.1 Nondeterministic and Probabilistic Regime p. 267
9.2.2 Strategies p. 269
9.3 Zero-Sum Games p. 271
9.3.1 Two-Player Games p. 271
9.3.2 Deterministic Strategy p. 272
9.3.3 Random Strategy p. 273
9.4 Nonzero-Sum Games p. 274
9.4.1 Nash Equilibrium p. 274
9.4.2 Random Nash Equilibria p. 276
References p. 276
10 Optimization Problems p. 279
10.1 Notations of Optimization Theory p. 279
10.1.1 Introduction p. 279
10.1.2 Convex Objects p. 280
10.2 Optimization Methods p. 282
10.2.1 Extremal Solutions Without Constraints p. 282
10.2.2 Extremal Solutions with Constraints p. 285
10.2.3 Linear Programming p. 286
10.2.4 Combinatorial Optimization Problems p. 287
10.2.5 Evolution Strategies p. 289
References p. 292