By Mascia, Franco; Birattari, Mauro; Stützle, Thomas
Reference: Learning and Intelligent Optimization, 7th International Conference, LION 7, Catania, Italy, January 7-11, 2013, Revised Selected Papers, Springer, Berlin / Heidelberg, Vol. 7997, pp. 410-422
Publication: Published, 2013
Published in conference proceedings
Abstract: Tuning stochastic local search algorithms for tackling large instances is difficult due to the large amount of CPU time that testing algorithm configurations requires on such instances. We define an experimental protocol that allows tuning an algorithm on small tuning instances and extrapolating from the obtained configurations a parameter setting suited for tackling large instances. The key element of our experimental protocol is that both the algorithm parameters that need to be scaled to large instances and the stopping time employed for the tuning instances are treated as free parameters. The scaling law of parameter values and the computation time limits on the small instances are then derived through the minimization of a loss function. As a proof of concept, we tune an iterated local search algorithm and a robust tabu search algorithm for the quadratic assignment problem. © 2013 Springer-Verlag.
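A minimal sketch of the extrapolation idea described above, assuming a power-law scaling p(n) = a·n^b for one parameter; the tuned (instance size, parameter value) pairs and the functional form are illustrative assumptions, not values from the paper:

```python
import math

# Hypothetical configurations obtained by a tuner on small instances:
# (instance size n, best-found parameter value p). Values are illustrative.
tuned = [(20, 40.0), (30, 62.0), (40, 85.0), (50, 110.0)]

def fit_power_law(points):
    """Least-squares fit of p(n) = a * n**b, done linearly in log-log space."""
    xs = [math.log(n) for n, _ in points]
    ys = [math.log(p) for _, p in points]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    # Slope of the log-log regression line gives the exponent b.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    # Intercept gives log(a).
    a = math.exp(mean_y - b * mean_x)
    return a, b

a, b = fit_power_law(tuned)

def extrapolate(n):
    """Parameter setting predicted for an instance of size n."""
    return a * n ** b

# Extrapolate the setting for a large instance never used during tuning.
large_setting = extrapolate(500)
```

In the paper's protocol the fitted scaling law and the cutoff times on the small instances are chosen jointly by minimizing a loss function; the simple per-parameter regression here only illustrates the extrapolation step.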