1 Parameters and Parameter Tuning


Which parameters are best to initialize with?

1.1 Parameters and Parameter Tuning

1.2 Brief historical account

1970s/80s: “GA is a robust method”
1970s onward: ESs self-adapt the mutation step size σ
1986: meta-GA for optimizing GA parameters
1990s: EP adopts self-adaptation of σ as ‘standard’
1990s: some papers on changing parameters on the fly
1999: Eiben–Michalewicz–Hinterding paper proposes a clear taxonomy and terminology

07-ParameterTuning/ch07-Parameters_and_Parameter_Tuning-20140.png

1.3 Taxonomy

1.4 Parameter tuning

1.5 Parameter control

1.6 Notes on parameter control

Parameter control offers the possibility to use appropriate values in various stages of the search.

Adaptive and self-adaptive control can “liberate” users from tuning,
reducing the need for EA expertise for a new application.

Assumption: control heuristic is less parameter-sensitive than the EA.

BUT

The state of the art is a mess:
the literature is heterogeneous, there is no generic knowledge,
no principled approach to developing control heuristics (deterministic or adaptive),
and no solid testing methodology.

WHAT ABOUT AUTOMATED TUNING?

1.7 Historical account (cont’d)

1.8 Control and information flow of EA calibration / design

07-ParameterTuning/tune-control-01.png
The whole field of EC is about this

1.9 Parameter – performance landscape

1.10 Ontology - Terminology

              LOWER PART         UPPER PART
METHOD        EA                 Tuner
SEARCH SPACE  Solution vectors   Parameter vectors
QUALITY       Fitness            Utility
ASSESSMENT    Evaluation         Test

1.11 Off-line vs. on-line calibration / design

1.12 Tuning by generate-and-test

EA tuning is a search problem itself.
Straightforward approach: generate-and-test
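
A minimal sketch of the generate-and-test loop, in Python. Everything here is illustrative: run_ea is a hypothetical stand-in for a single run of the EA being tuned, and the parameter names and ranges are assumptions, not prescriptions. Because EAs are stochastic, each vector is tested several times and its utility is the average performance.

```python
import random

def run_ea(params):
    # Hypothetical stand-in for one EA run; replace with a real EA.
    # This toy score mildly prefers mid-sized populations and low
    # mutation rates, plus noise to mimic EA stochasticity.
    noise = random.gauss(0, 0.05)
    return (-abs(params["population_size"] - 100) / 500
            - params["mutation_rate"] + noise)

def utility(params, n_tests=10):
    """Utility of a parameter vector = average performance over
    n_tests independent EA runs (EAs are stochastic)."""
    return sum(run_ea(params) for _ in range(n_tests)) / n_tests

def generate_and_test(n_vectors=50):
    """Generate random parameter vectors, test each, return the best."""
    best, best_u = None, float("-inf")
    for _ in range(n_vectors):
        params = {
            "population_size": random.randint(10, 500),
            "mutation_rate":   random.uniform(0.001, 0.2),
            "crossover_rate":  random.uniform(0.5, 1.0),
        }
        u = utility(params)
        if u > best_u:
            best, best_u = params, u
    return best
```

Smarter tuners differ from this loop only in how they choose the next vectors to generate and how many tests each vector receives.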

1.13 Testing parameter vectors

1.14 Numeric parameters

1.15 Symbolic parameters

E.g., xover_operator, elitism, selection_method
Finite domain, e.g., {1-point, uniform, averaging}, {Y, N}
No sensible distance metric between the values, hence no searchable ordering: symbolic parameters cannot be searched, only sampled
(Numeric parameters, by contrast, do have a searchable ordering; see the sketch below.)
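
A short sketch of the practical consequence, with illustrative (assumed) domains: a numeric parameter can be varied by a small step along its ordering, while a symbolic one can only be resampled from its finite domain.

```python
import random

# Illustrative symbolic domains; not prescribed by any particular EA.
SYMBOLIC_DOMAINS = {
    "xover_operator":   ["1-point", "uniform", "averaging"],
    "elitism":          ["Y", "N"],
    "selection_method": ["roulette", "tournament"],
}

def vary_numeric(value, step=0.05, lo=0.0, hi=1.0):
    # Small local move: exploits the ordering of the numeric domain.
    return min(hi, max(lo, value + random.gauss(0, step)))

def vary_symbolic(name):
    # No neighbourhood to exploit: draw uniformly from the finite domain.
    return random.choice(SYMBOLIC_DOMAINS[name])
```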

1.16 Notes on parameters

1.17 What is an EA? (1/2)

                     ALG-1            ALG-2            ALG-3            ALG-4
SYMBOLIC PARAMETERS
Representation       Bit-string       Bit-string       Real-valued      Real-valued
Overlapping pops     N                Y                Y                Y
Survivor selection   –                Tournament       Replace worst    Replace worst
Parent selection     Roulette wheel   Uniform determ.  Tournament       Tournament
Mutation             Bit-flip         Bit-flip         N(0,σ)           N(0,σ)
Recombination        Uniform xover    Uniform xover    Discrete recomb  Discrete recomb
NUMERIC PARAMETERS
Generation gap       –                0.5              0.9              0.9
Population size      100              500              100              300
Tournament size      –                2                3                30
Mutation rate        0.01             0.1              –                –
Mutation stepsize    –                –                0.01             0.05
Crossover rate       0.8              0.7              1                0.8
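
The table reads as four points in a single design space: an “EA” is just an instantiation of symbolic and numeric parameters. A minimal encoding of the ALG-3 column as a parameter vector (field names and value spellings are assumptions for illustration):

```python
# Hypothetical encoding of ALG-3 from the table above.
ALG_3 = {
    # symbolic parameters
    "representation":     "real-valued",
    "overlapping_pops":   True,
    "survivor_selection": "replace-worst",
    "parent_selection":   "tournament",
    "mutation":           "gaussian",       # N(0, sigma)
    "recombination":      "discrete",
    # numeric parameters
    "generation_gap":     0.9,
    "population_size":    100,
    "tournament_size":    3,
    "mutation_stepsize":  0.01,
    "crossover_rate":     1.0,
}
```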

1.18 Generate-and-test under the hood

1.19 Tuning effort

Total tuning effort ≈ A × B, where A is the number of parameter vectors tested and B is the number of tests (EA runs) per vector; e.g., testing A = 100 vectors B = 30 times each costs 3000 EA runs.
1.20 Optimize A = optimally use A

1.21 Optimize B = reduce B

1.22 Optimize A & B

Existing work:
Meta-EA with racing (Yuan & Gallagher ’04)
New trick: sharpening (Smit & Eiben 2009)
Idea: test each vector only X < B times and increase X over time during the run of a population-based tuner
Newest method: REVAC with racing & sharpening = REVAC++
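
A minimal sketch of sharpening, assuming a population-based tuner that proceeds in generations. The growth schedule below is an illustrative assumption, not the published one: early vectors get few tests, later (surviving) vectors get more, capped at the per-vector budget B.

```python
def tests_per_vector(tuner_generation, x_init=5, growth=1.5, budget=30):
    """Number of tests X for vectors evaluated in this tuner generation:
    X starts small and increases over time, capped at the budget B."""
    x = int(x_init * growth ** tuner_generation)
    return min(x, budget)

def estimated_utility(params, tuner_generation, run_ea):
    # Cheap, noisy utility estimates early on; sharper estimates later,
    # when they matter most for ranking the surviving vectors.
    x = tests_per_vector(tuner_generation)
    return sum(run_ea(params) for _ in range(x)) / x
```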

1.23 Which tuning method?

1.24 Tuning to study EAs themselves

07-ParameterTuning/tune-control-02.png
07-ParameterTuning/tune-control-03.png

1.25 Tuning “world champion” EAs

             G-CMA-ES                     SaDE
Tuned by     Avg    St dev   CEC Δ       Avg    St dev   CEC Δ
G-CMA-ES     0.77   0.2      20 %        0.73   0.25     49 %
REVAC++      0.85   0.24     12 %        0.67   0.22     53 %
SPOT         0.76   0.19     22 %        0.73   0.20     49 %
CEC-2005     0.97   0.32     –           1.43   0.25     –

Ranking after tuning: 1. SaDE, 2. CMA-ES
Ranking at CEC 2005: 1. CMA-ES, 2. SaDE

1.26 Tuning vs. not tuning

EA as is (accidental parameters)
EA as it can be (“optimal” parameters)

1.27 Recommendations

DO TUNE your evolutionary algorithm.
Think of the magic constants.
Decide: speed or solution quality?
Decide: specialist or generalist EA?
Measure and report tuning effort.
Try the old toolbox: http://sourceforge.net/projects/mobat

1.28 Example study ‘Best parameters’

1.29 Example study ‘Good parameters’

1.30 Example study ‘interactions’

Setup: same as before.
Results: plotting the population size and generation gap of the best parameter vectors shows the following
Conclusions: for this problem the best results are obtained when (almost) the complete population is replaced every generation.
07-ParameterTuning/ch07-Parameters_and_Parameter_Tuning-20141.png
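
A hypothetical sketch of how such an interaction plot can be produced, assuming the tuner’s best vectors are available as dicts with population_size and generation_gap keys (both names are assumptions):

```python
import matplotlib.pyplot as plt

def plot_interaction(best_vectors):
    """Scatter population size against generation gap for the best
    parameter vectors found by the tuner, exposing their interaction."""
    xs = [v["population_size"] for v in best_vectors]
    ys = [v["generation_gap"] for v in best_vectors]
    plt.scatter(xs, ys)
    plt.xlabel("Population size")
    plt.ylabel("Generation gap")
    plt.title("Best parameter vectors")
    plt.show()
```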

1.31 The (near) future of automated tuning

Hybrid methods for A & B
Well-founded EA performance measures; multi-objective formulation, hence multi-objective tuner algorithms
(Statistical) models of the utility landscape: more knowledge about parameters
Open source toolboxes
Distributed execution
Good testbeds
Adoption by the EC community
Rollout to other heuristic methods with parameters

1.32 Culture change?