
Optimization in Engineering Sciences: Metaheuristic, Stochastic Methods and Decision Support

ISBN: 978-1-84821-498-9
446 pages
December 2014, Wiley-ISTE

Description

This book presents the main metaheuristic, approximate and stochastic methods for the optimization of complex systems in the engineering sciences. It was written within the framework of the European Union project ERRIC (Empowering Romanian Research on Intelligent Information Technologies), funded by the EU's FP7 Research Potential program, and was developed in cooperation between French and Romanian teaching researchers. Through the principles of the various algorithms presented (with additional references), the book allows the reader to explore implementation approaches such as metaheuristics, local search and population-based methods. It also examines multi-objective and stochastic optimization, as well as methods and tools for computer-aided decision-making and simulation for decision-making.
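To give a flavor of the local-search metaheuristics covered in Chapter 1, here is a minimal, generic hill-climbing sketch (not taken from the book; the function, neighborhood and names are illustrative assumptions):

```python
def hill_climb(f, x0, neighbors, max_iters=1000):
    """Greedy local search: repeatedly move to the best improving
    neighbor of the current point; stop at a local minimum."""
    x, fx = x0, f(x0)
    for _ in range(max_iters):
        best = min(neighbors(x), key=f, default=None)
        if best is None or f(best) >= fx:
            return x, fx  # no improving neighbor: local minimum
        x, fx = best, f(best)
    return x, fx

# Usage: minimize f(x) = (x - 3)^2 over the integers,
# with the two nearest integers as the neighborhood.
f = lambda x: (x - 3) ** 2
nbrs = lambda x: [x - 1, x + 1]
x_star, f_star = hill_climb(f, x0=10, neighbors=nbrs)
# x_star == 3, f_star == 0
```

As the book's Chapter 1 discusses, such greedy descent stops at the first local optimum; taboo search, simulated annealing and tunneling are devices for escaping it.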


Table of Contents

LIST OF FIGURES ix

LIST OF TABLES xiii

LIST OF ALGORITHMS xv

LIST OF ACRONYMS xvii

PREFACE xix

ACKNOWLEDGEMENTS xxi

CHAPTER 1. METAHEURISTICS – LOCAL METHODS 1

1.1. Overview 1

1.2. Monte Carlo principle 6

1.3. Hill climbing 12

1.4. Taboo search 20

1.4.1. Principle 20

1.4.2. Greedy descent algorithm 20

1.4.3. Taboo search method 23

1.4.4. Taboo list 25

1.4.5. Taboo search algorithm 26

1.4.6. Intensification and diversification 30

1.4.7. Application examples 31

1.5. Simulated annealing 39

1.5.1. Principle of thermal annealing 39

1.5.2. Kirkpatrick’s model of thermal annealing 41

1.5.3. Simulated annealing algorithm 43

1.6. Tunneling 46

1.6.1. Tunneling principle 46

1.6.2. Types of tunneling 48

1.6.3. Tunneling algorithm 49

1.7. GRASP methods 51

CHAPTER 2. METAHEURISTICS – GLOBAL METHODS 53

2.1. Principle of evolutionary metaheuristics 53

2.2. Genetic algorithms 55

2.2.1. Biology breviary 55

2.2.2. Features of genetic algorithms 57

2.2.3. General structure of a GA 73

2.2.4. On the convergence of GA 77

2.2.5. How to implement a genetic algorithm 84

2.3. Hill climbing by evolutionary strategies 100

2.3.1. Climbing by the steepest ascent 101

2.3.2. Climbing by the next ascent 104

2.3.3. Hill climbing by group of alpinists 106

2.4. Optimization by ant colonies 107

2.4.1. Ant colonies 107

2.4.2. Basic optimization algorithm by ant colonies 110

2.4.3. Pheromone trail update 118

2.4.4. Systemic ant colony algorithm 122

2.4.5. Traveling salesman example 128

2.5. Particle swarm optimization 132

2.5.1. Basic metaheuristic 132

2.5.2. Standard PSO algorithm 141

2.5.3. Adaptive PSO algorithm with evolutionary strategy 146

2.5.4. Fireflies algorithm 163

2.5.5. Bats algorithm 173

2.5.6. Bees algorithm 182

2.5.7. Multivariable prediction by PSO 194

2.6. Optimization by harmony search 207

2.6.1. Musical composition and optimization 207

2.6.2. Harmony search model 208

2.6.3. Standard harmony search algorithm 212

2.6.4. Application example 215

CHAPTER 3. STOCHASTIC OPTIMIZATION 219

3.1. Introduction 219

3.2. Stochastic optimization problem 221

3.3. Computing the repartition function of a random variable 222

3.4. Statistical criteria for optimality 230

3.4.1. Case of totally admissible solutions 231

3.4.2. Case of partially admissible solutions 234

3.5. Examples 240

3.6. Stochastic optimization through games theory 245

3.6.1. Principle 245

3.6.2. Wald strategy (maximin) 247

3.6.3. Hurwicz strategy 248

3.6.4. Laplace strategy 249

3.6.5. Bayes–Laplace strategy 249

3.6.6. Savage strategy 250

3.6.7. Example 251

CHAPTER 4. MULTI-CRITERIA OPTIMIZATION 253

4.1. Introduction 253

4.2. Introductory examples 255

4.2.1. Choosing the first job 255

4.2.2. Selecting an IT tool 256

4.2.3. Setting the production rate of a continuous process plant 256

4.3. Multi-criteria optimization problems 257

4.3.1. Two subclasses of problems 257

4.3.2. Dominance and Pareto optimality 262

4.4. Model solving methods 265

4.4.1. Classifications 265

4.4.2. Substitution-based methods 266

4.4.3. Aggregation-based methods 270

4.4.4. Other methods 282

4.5. Two objective functions optimization for advanced control systems 292

4.5.1. Aggregating identification with the design of a dynamical control system 292

4.5.2. Aggregating decision model identification with the supervision 302

4.6. Notes and comments 307

CHAPTER 5. METHODS AND TOOLS FOR MODEL-BASED DECISION-MAKING 309

5.1. Introduction 309

5.2. Introductory examples 310

5.2.1. Choosing a job: probabilistic case 310

5.2.2. Starting a business 311

5.2.3. Selecting an IT engineer 311

5.3. Decisions and decision activities

5.3.1. Definition 313

5.3.2. Approaches 314

5.4. Decision analysis 316

5.4.1. Preliminary analysis: preparing the choice 317

5.4.2. Making a choice: structuring and solving decision problems 330

5.5. Notes and comments 347

5.6. Other remarks/comments 347

CHAPTER 6. DECISION-MAKING – CASE STUDY SIMULATION 351

6.1. Decision problem in uncertain environment 351

6.2. Problem statement 352

6.3. Simulation principle 353

6.4. Case studies 357

6.4.1. Stock management 358

6.4.2. Competitive tender 362

6.4.3. Queuing process of ATM 365

APPENDIX 1 369

APPENDIX 2 377

BIBLIOGRAPHY 393

INDEX 413
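As an illustration of the simulated annealing material in section 1.5, the following is a minimal Kirkpatrick-style sketch (a generic illustration under assumed names and cooling schedule, not the book's own implementation):

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=10.0, cooling=0.95,
                        steps=2000, seed=0):
    """Accept any improving move; accept a worsening move with
    probability exp(-delta / T), while T cools geometrically."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = f(y)
        delta = fy - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
        if t < 1e-12:
            break  # effectively frozen
    return best, fbest

# Usage: minimize a bumpy 1-D function on the integers.
f = lambda x: (x % 7) + abs(x - 20) / 5.0
step = lambda x, rng: x + rng.choice([-3, -1, 1, 3])
x_star, f_star = simulated_annealing(f, x0=0, neighbor=step)
```

The acceptance rule lets early, high-temperature iterations jump out of local minima, which pure hill climbing cannot do.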
