Summary
Evolutionary algorithms are successful biologically inspired meta-heuristics, and their success depends on adequate parameter settings. This raises the question: how can evolutionary algorithms learn their parameters automatically during the optimization? Evolution strategies gave an answer decades ago: self-adaptation. Their self-adaptive mutation control has proved exceptionally successful, yet self-adaptation has not received the attention it deserves.
This book introduces various types of self-adaptive parameters for evolutionary computation. Biased mutation for evolution strategies is useful in constrained search spaces, and self-adaptive inversion mutation accelerates the search on combinatorial TSP-like problems. After an analysis of self-adaptive crossover operators, the book concentrates on the premature convergence of self-adaptive mutation control at the constraint boundary. Extensive experiments as well as statistical tests and some theoretical investigations enrich the analysis of the proposed concepts.
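The self-adaptive mutation control the summary refers to can be sketched as a minimal (1, λ) evolution strategy with log-normal step-size adaptation: the step size σ is encoded in each individual and mutated before the object variables, so good step sizes are selected along with good solutions. The function names, defaults, and the sphere test function below are illustrative assumptions, not taken from the book:

```python
import math
import random

random.seed(1)

def sphere(x):
    """Sphere benchmark: f(x) = sum of squares, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def self_adaptive_es(n=10, lam=20, generations=200, tau=None):
    """(1, lambda)-ES with log-normal self-adaptation of one step size."""
    if tau is None:
        tau = 1.0 / math.sqrt(n)  # common learning-rate heuristic
    x = [random.uniform(-5, 5) for _ in range(n)]
    sigma = 1.0  # strategy parameter, evolved together with x
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            # mutate the step size first (log-normal), then the variables
            s = sigma * math.exp(tau * random.gauss(0, 1))
            y = [xi + s * random.gauss(0, 1) for xi in x]
            offspring.append((sphere(y), y, s))
        _, x, sigma = min(offspring)  # comma selection: best offspring only
    return x, sigma, sphere(x)
```

Because selection acts on fitness alone, the step size adapts implicitly: offspring produced with well-scaled σ tend to be fitter, so their σ survives into the next generation.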
Table of Contents
1 Introduction | p. 1 |
1.1 Motivation | p. 1 |
1.2 A Survey of This Book | p. 2 |
Part I Foundations of Evolutionary Computation | |
2 Evolutionary Algorithms | p. 9 |
2.1 Introduction to Evolutionary Computation | p. 9 |
2.1.1 Computational Intelligence | p. 9 |
2.1.2 Optimization Problems and Stochastic Convergence | p. 11 |
2.1.3 Classic Optimization Methods | p. 12 |
2.1.4 A Short Excursus to Molecular Biology and Genetics | p. 13 |
2.1.5 Concepts of the Evolutionary Computation Framework | p. 13 |
2.1.6 Types of Evolutionary Algorithms | p. 17 |
2.2 Evolution Strategies | p. 21 |
2.2.1 The (μ/ρ⁺, λ)-ES | p. 21 |
2.2.2 The (μ, κ, λ, ρ)-ES | p. 22 |
2.3 Practical Guidelines for Evolutionary Algorithms | p. 22 |
2.4 Theories of Evolutionary Algorithms | p. 24 |
2.5 Summary | p. 27 |
3 Self-Adaptation | p. 29 |
3.1 History of Parameter Adaptation | p. 29 |
3.2 An Extended Taxonomy of Parameter Setting Techniques | p. 30 |
3.2.1 Preliminary Work | p. 30 |
3.2.2 Parameter Tuning | p. 31 |
3.2.3 Parameter Control | p. 32 |
3.3 Typical Adapted Parameters | p. 35 |
3.4 The Concept of Self-Adaptation | p. 37 |
3.5 Self-Adaptation of Global Parameters | p. 40 |
3.6 Theoretical Approaches Toward Self-Adaptation | p. 40 |
3.7 A Self-Adaptive Estimation of Distribution View on EAs | p. 42 |
3.7.1 Preliminary Views on Self-Adaptation | p. 42 |
3.7.2 Estimation of Distribution Algorithms | p. 43 |
3.7.3 Self-Adaptation: Evolving the Estimation of Distribution | p. 43 |
3.7.4 Views on the Proposed Operators and Problems of This Work | p. 44 |
3.8 Premature Convergence | p. 45 |
3.9 Summary | p. 46 |
Part II Self-Adaptive Operators | |
4 Biased Mutation for Evolution Strategies | p. 51 |
4.1 Mutation Operators for Evolution Strategies | p. 51 |
4.1.1 Uncorrelated Isotropic Gaussian Mutation with One Step Size | p. 53 |
4.1.2 Uncorrelated Gaussian Mutation with N Step Sizes | p. 54 |
4.1.3 Correlated Mutation | p. 54 |
4.1.4 Asymmetric Density Functions - Directed Mutation | p. 55 |
4.1.5 Cauchy Mutation | p. 56 |
4.1.6 Covariance Matrix Adaptation (CMA) | p. 56 |
4.1.7 Self-Adaptive Mutation for Binary Representations | p. 58 |
4.1.8 Mutation Operators for Strategy Parameters | p. 58 |
4.2 The Biased Mutation Operator | p. 58 |
4.2.1 BMO Concept | p. 59 |
4.2.2 Sphere Biased Mutation Operator (sBMO) | p. 60 |
4.2.3 Cube Biased Mutation Operator (cBMO) | p. 61 |
4.2.4 Comparison of Computational Effort of the Randomized Operators | p. 61 |
4.3 The Descent Direction Mutation Operator (DMO) | p. 62 |
4.4 Success Rate on Monotone Functions | p. 63 |
4.5 Experimental Analysis | p. 64 |
4.5.1 Unconstrained Real Parameter Optimization | p. 65 |
4.5.2 Climbing Ridges with Biased Mutation | p. 71 |
4.5.3 Handling Constraints with Biased Mutation | p. 74 |
4.6 Excursus: Self-Adaptive Mutation Operator Selection | p. 79 |
4.7 Summary | p. 80 |
5 Self-Adaptive Inversion Mutation | p. 81 |
5.1 Introduction | p. 81 |
5.1.1 The Traveling Salesman Problem | p. 81 |
5.1.2 Evolutionary Combinatorial Optimization | p. 82 |
5.1.3 Self-Adaptation for Discrete Strategy Variables | p. 82 |
5.2 Self-Adaptive Inversion Mutation | p. 83 |
5.3 Convergence Properties | p. 84 |
5.4 Experimental Analysis | p. 86 |
5.4.1 TSP Instance Berlin52 | p. 87 |
5.4.2 TSP Instance Bier127 | p. 88 |
5.4.3 TSP Instance Gr666 | p. 89 |
5.4.4 Small TSP Instances | p. 92 |
5.4.5 Statistical Test | p. 92 |
5.5 The Strategy Bound Problem and SA-INV-c | p. 92 |
5.6 Summary | p. 94 |
6 Self-Adaptive Crossover | p. 97 |
6.1 The Self-Adaptive Crossover Concept | p. 97 |
6.1.1 The Role of Crossover - Building Blocks or Genetic Repair? | p. 98 |
6.1.2 Preliminary Work | p. 99 |
6.1.3 Adaptation of the Crossover Structure | p. 99 |
6.2 Self-Adaptive n-Point Crossover | p. 101 |
6.2.1 SA-1-Point | p. 101 |
6.2.2 SA-n-Point | p. 103 |
6.2.3 Self-Adaptive Uniform and Multi Parent Crossover | p. 103 |
6.3 Self-Adaptive Partially Mapped Crossover | p. 105 |
6.4 Self-Adaptive Recombination for Evolution Strategies (SAR) | p. 107 |
6.4.1 Intermediate and Dominant Recombination | p. 108 |
6.4.2 Self-Adaptive Recombination | p. 108 |
6.4.3 SAR Variants | p. 109 |
6.4.4 Experimental Analysis | p. 109 |
6.5 Crossover Point Optimization | p. 111 |
6.6 Summary | p. 112 |
Part III Constraint Handling | |
7 Constraint Handling Heuristics for Evolution Strategies | p. 117 |
7.1 The NLP Problem | p. 117 |
7.2 A Short Survey of Constraint-Handling Methods | p. 118 |
7.2.1 Penalty Functions | p. 118 |
7.2.2 Penalty-Related Methods | p. 119 |
7.2.3 Repair Algorithms | p. 119 |
7.2.4 Decoder Functions | p. 119 |
7.2.5 Multiobjective Optimization | p. 120 |
7.2.6 Constraint Preserving Operators and Representations | p. 120 |
7.2.7 Recent Developments | p. 120 |
7.3 Premature Step Size Reduction | p. 120 |
7.3.1 Experimental Analysis | p. 121 |
7.3.2 Theoretical Analysis | p. 121 |
7.4 The Death Penalty Step Control Approach | p. 127 |
7.4.1 Basic Minimum Step Size Reduction Mechanism | p. 127 |
7.4.2 Experimental Analysis | p. 128 |
7.5 Constraint-Handling with Two Sexes | p. 131 |
7.5.1 Biologically Inspired Constraint-Handling | p. 131 |
7.5.2 Modifications of the Basic Two Sexes Evolution Strategy | p. 132 |
7.5.3 Experimental Analysis | p. 133 |
7.6 The Nested Angle Evolution Strategy | p. 137 |
7.6.1 Meta-evolution for Mutation Ellipsoid Rotation | p. 137 |
7.6.2 Experimental Analysis | p. 138 |
7.6.3 Outlook: Covariance Matrix Adaptation by Feasibility Classification | p. 139 |
7.7 Summary | p. 139 |
Part IV Summary | |
8 Summary and Conclusion | p. 143 |
8.1 Contributions of This Book | p. 143 |
8.2 Conclusion | p. 146 |
Part V Appendix | |
A Continuous Benchmark Functions | p. 149 |
A.1 Unimodal Numerical Functions | p. 149 |
A.2 Multimodal Numerical Functions | p. 150 |
A.3 Ridge Functions | p. 152 |
A.4 Numerical Functions in Bit String Representations | p. 153 |
A.5 Constrained Numerical Functions | p. 154 |
B Discrete Benchmark Functions | p. 159 |
B.1 Traveling Salesman Problems | p. 159 |
B.2 SAT-Problem | p. 161 |
References | p. 163 |
List of Figures | p. 173 |
List of Tables | p. 175 |
Index | p. 179 |