Title:
Metaheuristics : from design to implementation
Personal Author:
Talbi, El-Ghazali
Physical Description:
xxix, 593 pages : illustrations ; 25 cm
ISBN:
9780470278581

Available:

Item Barcode: 33000000003483
Call Number: QA402.5 T35 2009
Material Type: Open Access Book
Item Category 1: Gift Book

Summary

A unified view of metaheuristics

This book provides a complete background on metaheuristics and shows readers how to design and implement efficient algorithms to solve complex optimization problems across a diverse range of applications, from networking and bioinformatics to engineering design, routing, and scheduling. It presents the main design questions for all families of metaheuristics and clearly illustrates how to implement the algorithms under a software framework to reuse both the design and code.

Throughout the book, the key search components of metaheuristics are considered as a toolbox for:

Designing efficient metaheuristics (e.g. local search, tabu search, simulated annealing, evolutionary algorithms, particle swarm optimization, scatter search, ant colonies, bee colonies, artificial immune systems) for optimization problems

Designing efficient metaheuristics for multi-objective optimization problems

Designing hybrid, parallel, and distributed metaheuristics

Implementing metaheuristics on sequential and parallel machines

Using many case studies and treating design and implementation independently, this book gives readers the skills necessary to solve large-scale optimization problems quickly and efficiently. It is a valuable reference for practicing engineers and researchers from diverse areas dealing with optimization or machine learning; and graduate students in computer science, operations research, control, engineering, business and management, and applied mathematics.
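To make the "toolbox" idea above concrete, the following is a minimal, framework-agnostic sketch of one of the metaheuristics named in the summary, simulated annealing, applied to a toy problem (minimizing f(x) = x^2 over the integers). The solution type, objective function, neighborhood move, and cooling parameters are illustrative placeholders chosen for this sketch; they are not the book's ParadisEO templates.

// Minimal sketch of a simulated annealing loop for a toy problem.
// All problem-specific pieces here (objective, neighborhood, parameters)
// are hypothetical placeholders, not the book's ParadisEO components.
#include <cmath>
#include <iostream>
#include <random>

// Objective function to minimize (placeholder problem: f(x) = x^2).
double objective(int x) { return static_cast<double>(x) * x; }

// Neighborhood move: a small random step around the current solution.
int random_neighbor(int x, std::mt19937& rng) {
    std::uniform_int_distribution<int> step(-3, 3);
    return x + step(rng);
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uniform(0.0, 1.0);

    int current = 100;                 // initial solution
    double current_cost = objective(current);
    int best = current;
    double best_cost = current_cost;

    double temperature = 100.0;        // initial temperature
    const double cooling_rate = 0.95;  // geometric cooling schedule
    const double final_temperature = 1e-3;

    while (temperature > final_temperature) {
        // At each temperature, attempt a fixed number of moves.
        for (int i = 0; i < 50; ++i) {
            int candidate = random_neighbor(current, rng);
            double candidate_cost = objective(candidate);
            double delta = candidate_cost - current_cost;

            // Accept improving moves always; accept worsening moves with
            // probability exp(-delta / T) (Metropolis criterion).
            if (delta < 0.0 || uniform(rng) < std::exp(-delta / temperature)) {
                current = candidate;
                current_cost = candidate_cost;
                if (current_cost < best_cost) {
                    best = current;
                    best_cost = current_cost;
                }
            }
        }
        temperature *= cooling_rate;   // cool down
    }

    std::cout << "best solution: " << best
              << " with cost " << best_cost << "\n";
    return 0;
}

The loop structure itself (initial solution, neighborhood move, acceptance criterion, cooling schedule) illustrates the kind of reusable search components the summary refers to: the design is independent of the particular problem being solved.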


Author Notes

EL-GHAZALI TALBI is a full Professor in Computer Science at the University of Lille (France), and head of the optimization group of the Computer Science Laboratory (L.I.F.L.). His current research interests are in the fields of metaheuristics, parallel algorithms, multi-objective combinatorial optimization, cluster and grid computing, hybrid and cooperative optimization, and application to bioinformatics, networking, transportation, and logistics. He is the founder of the conference META (International Conference on Metaheuristics and Nature Inspired Computing), and is head of the INRIA Dolphin project dealing with robust multi-objective optimization of complex systems.


Table of Contents

Preface
Acknowledgments
Glossary
1 Common Concepts for Metaheuristics
1.1 Optimization Models
1.1.1 Classical Optimization Models
1.1.2 Complexity Theory
1.1.2.1 Complexity of Algorithms
1.1.2.2 Complexity of Problems
1.2 Other Models for Optimization
1.2.1 Optimization Under Uncertainty
1.2.2 Dynamic Optimization
1.2.2.1 Multiperiodic Optimization
1.2.3 Robust Optimization
1.3 Optimization Methods
1.3.1 Exact Methods
1.3.2 Approximate Algorithms
1.3.2.1 Approximation Algorithms
1.3.3 Metaheuristics
1.3.4 Greedy Algorithms
1.3.5 When Using Metaheuristics
1.4 Main Common Concepts for Metaheuristics
1.4.1 Representation
1.4.1.1 Linear Representations
1.4.1.2 Nonlinear Representations
1.4.1.3 Representation-Solution Mapping
1.4.1.4 Direct Versus Indirect Encodings
1.4.2 Objective Function
1.4.2.1 Self-Sufficient Objective Functions
1.4.2.2 Guiding Objective Functions
1.4.2.3 Representation Decoding
1.4.2.4 Interactive Optimization
1.4.2.5 Relative and Competitive Objective Functions
1.4.2.6 Meta-Modeling
1.5 Constraint Handling
1.5.1 Reject Strategies
1.5.2 Penalizing Strategies
1.5.3 Repairing Strategies
1.5.4 Decoding Strategies
1.5.5 Preserving Strategies
1.6 Parameter Tuning
1.6.1 Off-Line Parameter Initialization
1.6.2 Online Parameter Initialization
1.7 Performance Analysis of Metaheuristics
1.7.1 Experimental Design
1.7.2 Measurement
1.7.2.1 Quality of Solutions
1.7.2.2 Computational Effort
1.7.2.3 Robustness
1.7.2.4 Statistical Analysis
1.7.2.5 Ordinal Data Analysis
1.7.3 Reporting
1.8 Software Frameworks for Metaheuristics
1.8.1 Why a Software Framework for Metaheuristics?
1.8.2 Main Characteristics of Software Frameworks
1.8.3 ParadisEO Framework
1.8.3.1 ParadisEO Architecture
1.9 Conclusions
1.10 Exercises
2 Single-Solution Based Metaheuristics
2.1 Common Concepts for Single-Solution Based Metaheuristics
2.1.1 Neighborhood
2.1.2 Very Large Neighborhoods
2.1.2.1 Heuristic Search in Large Neighborhoods
2.1.2.2 Exact Search in Large Neighborhoods
2.1.2.3 Polynomial-Specific Neighborhoods
2.1.3 Initial Solution
2.1.4 Incremental Evaluation of the Neighborhood
2.2 Fitness Landscape Analysis
2.2.1 Distances in the Search Space
2.2.2 Landscape Properties
2.2.2.1 Distribution Measures
2.2.2.2 Correlation Measures
2.2.3 Breaking Plateaus in a Flat Landscape
2.3 Local Search
2.3.1 Selection of the Neighbor
2.3.2 Escaping from Local Optima
2.4 Simulated Annealing
2.4.1 Move Acceptance
2.4.2 Cooling Schedule
2.4.2.1 Initial Temperature
2.4.2.2 Equilibrium State
2.4.2.3 Cooling
2.4.2.4 Stopping Condition
2.4.3 Other Similar Methods
2.4.3.1 Threshold Accepting
2.4.3.2 Record-to-Record Travel
2.4.3.3 Great Deluge Algorithm
2.4.3.4 Demon Algorithms
2.5 Tabu Search
2.5.1 Short-Term Memory
2.5.2 Medium-Term Memory
2.5.3 Long-Term Memory
2.6 Iterated Local Search
2.6.1 Perturbation Method
2.6.2 Acceptance Criteria
2.7 Variable Neighborhood Search
2.7.1 Variable Neighborhood Descent
2.7.2 General Variable Neighborhood Search
2.8 Guided Local Search
2.9 Other Single-Solution Based Metaheuristics
2.9.1 Smoothing Methods
2.9.2 Noisy Method
2.9.3 GRASP
2.10 S-Metaheuristic Implementation Under ParadisEO
2.10.1 Common Templates for Metaheuristics
2.10.2 Common Templates for S-Metaheuristics
2.10.3 Local Search Template
2.10.4 Simulated Annealing Template
2.10.5 Tabu Search Template
2.10.6 Iterated Local Search Template
2.11 Conclusions
2.12 Exercises
3 Population-Based Metaheuristics
3.1 Common Concepts for Population-Based Metaheuristics
3.1.1 Initial Population
3.1.1.1 Random Generation
3.1.1.2 Sequential Diversification
3.1.1.3 Parallel Diversification
3.1.1.4 Heuristic Initialization
3.1.2 Stopping Criteria
3.2 Evolutionary Algorithms
3.2.1 Genetic Algorithms
3.2.2 Evolution Strategies
3.2.3 Evolutionary Programming
3.2.4 Genetic Programming
3.3 Common Concepts for Evolutionary Algorithms
3.3.1 Selection Methods
3.3.1.1 Roulette Wheel Selection
3.3.1.2 Stochastic Universal Sampling
3.3.1.3 Tournament Selection
3.3.1.4 Rank-Based Selection
3.3.2 Reproduction
3.3.2.1 Mutation
3.3.2.2 Recombination or Crossover
3.3.3 Replacement Strategies
3.4 Other Evolutionary Algorithms
3.4.1 Estimation of Distribution Algorithms
3.4.2 Differential Evolution
3.4.3 Coevolutionary Algorithms
3.4.4 Cultural Algorithms
3.5 Scatter Search
3.5.1 Path Relinking
3.6 Swarm Intelligence
3.6.1 Ant Colony Optimization Algorithms
3.6.1.1 ACO for Continuous Optimization Problems
3.6.2 Particle Swarm Optimization
3.7.2 Artificial Immune Systems
3.7.2.1 Natural Immune System
3.7.2.2 Clonal Selection Theory
3.7.2.3 Negative Selection Principle
3.7.2.4 Immune Network Theory
3.7.2.5 Danger Theory
3.8 P-Metaheuristics Implementation Under ParadisEO
3.8.1 Common Components and Programming Hints
3.8.1.1 Main Core Templates: ParadisEO-EO's Functors
3.8.1.2 Representation
3.8.2 Fitness Function
3.8.2.1 Initialization
3.8.2.2 Stopping Criteria, Checkpoints, and Statistics
3.8.2.3 Dynamic Parameter Management and State Loader/Register
3.8.3 Evolutionary Algorithms Under ParadisEO
3.8.3.1 Representation
3.8.3.2 Initialization
3.8.3.3 Evaluation
3.8.3.4 Variation Operators
3.8.3.5 Evolution Engine
3.8.3.6 Evolutionary Algorithms
3.8.4 Particle Swarm Optimization Under ParadisEO
3.8.4.1 Illustrative Example
3.8.5 Estimation of Distribution Algorithm Under ParadisEO
3.9 Conclusions
3.10 Exercises
4 Metaheuristics for Multiobjective Optimization
4.1 Multiobjective Optimization Concepts
4.2 Multiobjective Optimization Problems
4.2.1 Academic Applications
4.2.1.1 Multiobjective Continuous Problems
4.2.1.2 Multiobjective Combinatorial Problems
4.2.2 Real-Life Applications
4.2.3 Multicriteria Decision Making
4.3 Main Design Issues of Multiobjective Metaheuristics
4.4 Fitness Assignment Strategies
4.4.1 Scalar Approaches
4.4.1.1 Aggregation Method
4.4.1.2 Weighted Metrics
4.4.1.3 Goal Programming
4.4.1.4 Achievement Functions
4.4.1.5 Goal Attainment
4.4.1.6 ε-Constraint Method
4.4.2 Criterion-Based Methods
4.4.2.1 Parallel Approach
4.4.2.2 Sequential or Lexicographic Approach
4.4.3 Dominance-Based Approaches
4.4.4 Indicator-Based Approaches
4.5 Diversity Preservation
4.5.1 Kernel Methods
4.5.2 Nearest-Neighbor Methods
4.5.3 Histograms
4.6 Elitism
4.7 Performance Evaluation and Pareto Front Structure
4.7.1 Performance Indicators
4.7.1.1 Convergence-Based Indicators
4.7.1.2 Diversity-Based Indicators
4.7.1.3 Hybrid Indicators
4.7.2 Landscape Analysis of Pareto Structures
4.8 Multiobjective Metaheuristics Under ParadisEO
4.8.1 Software Frameworks for Multiobjective Metaheuristics
4.8.2 Common Components
4.8.2.1 Representation
4.8.2.2 Fitness Assignment Schemes
4.8.2.3 Diversity Assignment Schemes
4.8.2.4 Elitism
4.8.2.5 Statistical Tools
4.8.3 Multiobjective EAs-Related Components
4.8.3.1 Selection Schemes
4.8.3.2 Replacement Schemes
4.8.3.3 Multiobjective Evolutionary Algorithms
4.9 Conclusions and Perspectives
4.10 Exercises
5 Hybrid Metaheuristics
5.1 Hybrid Metaheuristics
5.1.1 Design Issues
5.1.1.1 Hierarchical Classification
5.1.1.2 Flat Classification
5.1.2 Implementation Issues
5.1.2.1 Dedicated Versus General-Purpose Computers
5.1.2.2 Sequential Versus Parallel
5.1.3 A Grammar for Extended Hybridization Schemes
5.2 Combining Metaheuristics with Mathematical Programming
5.2.1 Mathematical Programming Approaches
5.2.1.1 Enumerative Algorithms
5.2.1.2 Relaxation and Decomposition Methods
5.2.1.3 Branch and Cut and Price Algorithms
5.2.2 Classical Hybrid Approaches
5.2.2.1 Low-Level Relay Hybrids
5.2.2.2 Low-Level Teamwork Hybrids
5.2.2.3 High-Level Relay Hybrids
5.2.2.4 High-Level Teamwork Hybrids
5.3 Combining Metaheuristics with Constraint Programming
5.3.1 Constraint Programming
5.3.2 Classical Hybrid Approaches
5.3.2.1 Low-Level Relay Hybrids
5.3.2.2 Low-Level Teamwork Hybrids
5.3.2.3 High-Level Relay Hybrids
5.3.2.4 High-Level Teamwork Hybrids
5.4 Hybrid Metaheuristics with Machine Learning and Data Mining
5.4.1 Data Mining Techniques
5.4.2 Main Schemes of Hybridization
5.4.2.1 Low-Level Relay Hybrid
5.4.2.2 Low-Level Teamwork Hybrids
5.4.2.3 High-Level Relay Hybrid
5.4.2.4 High-Level Teamwork Hybrid
5.5 Hybrid Metaheuristics for Multiobjective Optimization
5.5.1 Combining Metaheuristics for MOPs
5.5.1.1 Low-Level Relay Hybrids
5.5.1.2 Low-Level Teamwork Hybrids
5.5.1.3 High-Level Relay Hybrids
5.5.1.4 High-Level Teamwork Hybrid
5.5.2 Combining Metaheuristics with Exact Methods for MOP
5.5.3 Combining Metaheuristics with Data Mining for MOP
5.6 Hybrid Metaheuristics Under ParadisEO
5.6.1 Low-Level Hybrids Under ParadisEO
5.6.2 High-Level Hybrids Under ParadisEO
5.6.3 Coupling with Exact Algorithms
5.7 Conclusions and Perspectives
5.8 Exercises
6 Parallel Metaheuristics
6.1 Parallel Design of Metaheuristics
6.1.1 Algorithmic-Level Parallel Model
6.1.1.1 Independent Algorithmic-Level Parallel Model
6.1.1.2 Cooperative Algorithmic-Level Parallel Model
6.1.2 Iteration-Level Parallel Model
6.1.2.1 Iteration-Level Model for S-Metaheuristics
6.1.2.2 Iteration-Level Model for P-Metaheuristics
6.1.3 Solution-Level Parallel Model
6.1.4 Hierarchical Combination of the Parallel Models
6.2 Parallel Implementation of Metaheuristics
6.2.1 Parallel and Distributed Architectures
6.2.2 Dedicated Architectures
6.2.3 Parallel Programming Environments and Middlewares
6.2.4 Performance Evaluation
6.2.5 Main Properties of Parallel Metaheuristics
6.2.6 Algorithmic-Level Parallel Model
6.2.7 Iteration-Level Parallel Model
6.2.8 Solution-Level Parallel Model
6.3 Parallel Metaheuristics for Multiobjective Optimization
6.3.1 Algorithmic-Level Parallel Model for MOP
6.3.2 Iteration-Level Parallel Model for MOP
6.3.3 Solution-Level Parallel Model for MOP
6.3.4 Hierarchical Parallel Model for MOP
6.4 Parallel Metaheuristics Under ParadisEO
6.4.1 Parallel Frameworks for Metaheuristics
6.4.2 Design of Algorithmic-Level Parallel Models
6.4.2.1 Algorithms and Transferred Data (What?)
6.4.2.2 Transfer Control (When?)
6.4.2.3 Exchange Topology (Where?)
6.4.2.4 Replacement Strategy (How?)
6.4.2.5 Parallel Implementation
6.4.2.6 A Generic Example
6.4.2.7 Island Model of EAs Within ParadisEO
6.4.3 Design of Iteration-Level Parallel Models
6.4.3.1 The Generic Multistart Paradigm
6.4.3.2 Use of the Iteration-Level Model
6.4.4 Design of Solution-Level Parallel Models
6.4.5 Implementation of Sequential Metaheuristics
6.4.6 Implementation of Parallel and Distributed Algorithms
6.4.7 Deployment of ParadisEO-PEO
6.5 Conclusions and Perspectives
6.6 Exercises
Appendix: UML and C++
A.1 A Brief Overview of UML Notations
A.2 A Brief Overview of the C++ Template Concept
References
Index