1 07-ParameterTuning


Which parameters are best to initialize with?

2 Parameters and Parameter Tuning

3 Brief historical account

1970s/80s: “GA is a robust method”
1970s+: ESs self-adapt the mutation step size σ
1986: meta-GA for optimizing GA parameters
1990s: EP adopts self-adaptation of σ as ‘standard’
1990s: some papers on changing parameters on-the-fly
1999: the Eiben-Hinterding-Michalewicz paper proposes a clear taxonomy and terminology

07-ParameterTuning/ch07-Parameters_and_Parameter_Tuning-20140.png

4 Taxonomy

5 Parameter tuning

6 Parameter control

7 Notes on parameter control

Parameter control offers the possibility to use appropriate values in various stages of the search
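
For example, a simple deterministic control rule lowers the mutation step size σ as the run proceeds, so the EA explores early and fine-tunes late. A minimal sketch, assuming a linear schedule; the name `sigma_schedule` and the start/end values are illustrative, not from the slides:

```python
# Illustrative deterministic parameter control (assumed schedule):
# sigma shrinks linearly from sigma_start to sigma_end over the run.

def sigma_schedule(generation: int, max_generations: int,
                   sigma_start: float = 1.0, sigma_end: float = 0.01) -> float:
    """Linearly interpolate sigma from sigma_start down to sigma_end."""
    fraction = generation / max_generations
    return sigma_start + fraction * (sigma_end - sigma_start)

# Example: sigma is 1.0 at generation 0, ~0.5 halfway, 0.01 at the end.
for g in (0, 50, 100):
    print(g, sigma_schedule(g, max_generations=100))
```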

Adaptive and self-adaptive control can “liberate” users from tuning
and reduce the need for EA expertise when tackling a new application

Assumption: control heuristic is less parameter-sensitive than the EA

BUT

The state of the art is a mess:
the literature is heterogeneous, with no generic knowledge,
no principled approaches to developing control heuristics (deterministic or adaptive),
and no solid testing methodology

WHAT ABOUT AUTOMATED TUNING?

8 Historical account (cont’d)

9 Control and information flow of EA calibration / design

07-ParameterTuning/tune-control-01.png
The whole field of EC is about this

10 Parameter – performance landscape

11 Ontology - Terminology

              LOWER PART          UPPER PART
METHOD        EA                  Tuner
SEARCH SPACE  Solution vectors    Parameter vectors
QUALITY       Fitness             Utility
ASSESSMENT    Evaluation          Test

12 Off-line vs. on-line calibration / design

13 Tuning by generate-and-test

EA tuning is a search problem itself.
Straightforward approach: generate-and-test
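
A minimal sketch of such a tuner, assuming a hypothetical `run_ea` as a stand-in for one run of the EA being tuned (a toy surrogate here, not a real EA); the utility of a parameter vector is estimated as mean fitness over repeated tests:

```python
import random

def run_ea(params: dict) -> float:
    """Placeholder: one EA run returning the best fitness found.
    Toy surrogate that pretends utility peaks at mutation_rate ~ 0.1."""
    return -abs(params["mutation_rate"] - 0.1) + random.gauss(0, 0.01)

def utility(params: dict, tests: int = 5) -> float:
    """Utility of a parameter vector: mean fitness over repeated tests."""
    return sum(run_ea(params) for _ in range(tests)) / tests

def generate() -> dict:
    """Generate a random parameter vector."""
    return {"mutation_rate": random.uniform(0.001, 0.5),
            "population_size": random.choice([50, 100, 300, 500])}

# Generate-and-test: sample parameter vectors, keep the best one seen.
best, best_u = None, float("-inf")
for _ in range(100):
    candidate = generate()
    u = utility(candidate)
    if u > best_u:
        best, best_u = candidate, u
print(best, best_u)
```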

14 Testing parameter vectors

15 Numeric parameters

16 Symbolic parameters

E.g., xover_operator, elitism, selection_method
Finite domain, e.g., {1-point, uniform, averaging}, {Y, N}
No sensible distance metric, hence no searchable ordering: unlike numeric parameters, whose ordered domains can be searched, symbolic parameters must be sampled (see the sketch below)
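
A small sketch of why the ordering matters; the names `mutate_numeric` and `mutate_symbolic` are illustrative, not from the slides. A numeric parameter admits local moves that exploit its ordering, while a symbolic one can only be resampled:

```python
import random

XOVER_OPERATORS = ["1-point", "uniform", "averaging"]  # finite symbolic domain

def mutate_numeric(value: float, step: float = 0.05) -> float:
    """Local move: a small perturbation exploits the domain's ordering."""
    return min(1.0, max(0.0, value + random.gauss(0, step)))

def mutate_symbolic(value: str) -> str:
    """No ordering, no neighborhood: uniformly resample a different operator."""
    return random.choice([op for op in XOVER_OPERATORS if op != value])

print(mutate_numeric(0.8))         # e.g. 0.83 -- a nearby crossover rate
print(mutate_symbolic("uniform"))  # e.g. "averaging" -- just a resample
```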

17 Notes on parameters

18 What is an EA? (1/2)

                     ALG-1           ALG-2            ALG-3            ALG-4
SYMBOLIC PARAMETERS
Representation       Bit-string      Bit-string       Real-valued      Real-valued
Overlapping pops     N               Y                Y                Y
Survivor selection   –               Tournament       Replace worst    Replace worst
Parent selection     Roulette wheel  Uniform determ.  Tournament       Tournament
Mutation             Bit-flip        Bit-flip         N(0,σ)           N(0,σ)
Recombination        Uniform xover   Uniform xover    Discrete recomb  Discrete recomb
NUMERIC PARAMETERS
Generation gap       –               0.5              0.9              0.9
Population size      100             500              100              300
Tournament size      –               2                3                30
Mutation rate        0.01            0.1              –                –
Mutation stepsize    –               –                0.01             0.05
Crossover rate       0.8             0.7              1.0              0.8
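
One way to read the table: ALG-1 to ALG-4 are four instances of a single parameterized EA. A sketch of such a configuration record, with assumed field names (not a fixed API):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EAConfig:
    # Symbolic parameters
    representation: str                # "bit-string" or "real-valued"
    overlapping_pops: bool
    survivor_selection: Optional[str]  # None where not applicable
    parent_selection: str
    mutation: str
    recombination: str
    # Numeric parameters (None where not applicable)
    generation_gap: Optional[float]
    population_size: int
    tournament_size: Optional[int]
    mutation_rate: Optional[float]
    mutation_stepsize: Optional[float]
    crossover_rate: float

# ALG-1 from the table above:
alg1 = EAConfig("bit-string", False, None, "roulette wheel", "bit-flip",
                "uniform xover", None, 100, None, 0.01, None, 0.8)
```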

19 Generate-and-test under the hood

20 Tuning effort

21 Optimize A = optimally use A

22 Optimize B = reduce B

23 Optimize A & B

Existing work:
Meta-EA with racing (Yuan & Gallagher ’04)
New trick: sharpening (Smit & Eiben 2009)
Idea: test each vector only X < B times (B being the per-vector test budget) and increase X over time during the run of a population-based tuner (see the sketch below)
Newest method: REVAC with racing & sharpening = REVAC++
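
A minimal sketch of a sharpening schedule; the linear ramp and the names are assumptions for illustration, not the published REVAC++ schedule. Early in the tuner run each vector gets few tests, so many vectors can be tried cheaply; later, the surviving vectors are re-tested more often, making their utility estimates reliable:

```python
def tests_per_vector(tuner_iteration: int, max_iterations: int,
                     min_tests: int = 1, max_tests: int = 30) -> int:
    """Ramp the number of tests X per vector from min_tests up to max_tests (< B)."""
    fraction = tuner_iteration / max_iterations
    return min_tests + round(fraction * (max_tests - min_tests))

# Example: 1 test per vector at the start, ~15 halfway, 30 at the end.
for it in (0, 500, 1000):
    print(it, tests_per_vector(it, max_iterations=1000))
```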

24 Which tuning method?

25 Tuning to study EAs themselves

07-ParameterTuning/tune-control-02.png
07-ParameterTuning/tune-control-03.png

26 Tuning “world champion” EAs

            G-CMA-ES                    SaDE
Tuned by    Avg    St dev   CEC Δ       Avg    St dev   CEC Δ
G-CMA-ES    0.77   0.20     20 %        0.73   0.25     49 %
REVAC++     0.85   0.24     12 %        0.67   0.22     53 %
SPOT        0.76   0.19     22 %        0.73   0.20     49 %
CEC-2005    0.97   0.32     –           1.43   0.25     –

Ranking after tuning:
1. SaDE
2. CMA-ES

Ranking at CEC 2005:
1. CMA-ES
2. SaDE

27 Tuning vs. not tuning

EA as is (accidental parameters)
EA as it can be (“optimal” parameters)

28 Recommendations

DO TUNE your evolutionary algorithm.
Think of the magic constants.
Decide: speed or solution quality?
Decide: specialist or generalist EA?
Measure and report tuning effort.
Try the old toolbox: http://sourceforge.net/projects/mobat

29 Example study ‘Best parameters’

30 Example study ‘Good parameters’

31 Example study ‘interactions’

Setup: same as before.
Results: plotting the population size and generation gap of the best parameter vectors shows the pattern in the figure below.
Conclusions: for this problem the best results are obtained when (almost) the complete population is replaced every generation.
07-ParameterTuning/ch07-Parameters_and_Parameter_Tuning-20141.png

32 The (near) future of automated tuning

Hybrid methods for A & B
Well-founded EA performance measures; a multi-objective formulation calls for multi-objective tuner algorithms
(Statistical) models of the utility landscape: more knowledge about parameters
Open source toolboxes
Distributed execution
Good testbeds
Adoption by the EC community
Rollout to other heuristic methods with parameters

33 Culture change?