Abstract: Stochastic local search (SLS) algorithms are a class of methods used to tackle hard combinatorial optimization problems. Despite not providing, in most cases, any guarantee on the quality of the final solution, they are often able to produce high quality solutions in a relatively short time. They are therefore routinely used in countless real world applications, and their wide applicability generates in turn a lot of research interest, both from a theoretical perspective and in applications to many problems. Several research lines not only propose SLS algorithms to solve problems, but also try to understand how SLS methods work and to explain the results obtained by an algorithm. These works are however focused on specific problems and instances, or address one particular point of view, and it is therefore difficult to connect the information they provide. Hence, despite the amount of research, how exactly SLS algorithms work is still an open research question.

In this thesis I take an experimental approach to study how one of the most popular SLS algorithms, simulated annealing (SA), works. I note that many instantiations of SA have been proposed to solve different problems, implementing new ideas over the original formulation, and that other algorithms appearing in the SLS literature under a different name can be represented in the simulated annealing structure. Thus, I collect these ideas into an algorithmic framework, classifying them by the specific role they play in the algorithm. I use automatic algorithm configuration tools to automatically improve the behaviour of existing SA algorithms for a set of problems, and to automatically generate new SA algorithms able to outperform the existing ones.
By analyzing the composition of the resulting algorithms, I observe the different characteristics that high quality SA algorithms need to exhibit to successfully tackle different problems.

I then investigate the relationship between the algorithm structure and a scenario, to understand how the characteristics of problems and instances impact the performance of SA algorithms. I consider two SA variants and evaluate them on a variety of scenarios. I then measure the conditions encountered by the algorithms, and relate them to the results the algorithms obtain. I observe that a fixed-temperature variant of simulated annealing works well when the structure of the neighbourhoods is similar in different areas of the search space. The traditional cooling simulated annealing is instead a better option when the conditions for escaping the paths towards locally optimal solutions change across different regions of the landscape.

Finally, I propose a causal framework that models the interplay between the elements involved in the solution of an optimization problem. I show how under this framework it is possible to represent several research directions, disambiguating the questions addressed by various works. I also show how clarifying the relationships between problems, instances, algorithms and results is useful to approach open problems. I provide a proof of concept of a transfer learning approach for algorithm configurations that works across different problems.
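To make the comparison between the two variants concrete, the following minimal Python sketch illustrates the standard Metropolis acceptance rule on a made-up one-dimensional toy problem (not one of the problems studied in the thesis). The only structural difference between the fixed-temperature variant and traditional geometric cooling is the cooling factor `alpha`; all names here are illustrative.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0, alpha=1.0, iters=2000, seed=0):
    """Minimise f starting from x0.

    alpha == 1.0 gives the fixed-temperature variant;
    alpha <  1.0 gives traditional geometric cooling.
    """
    rng = random.Random(seed)
    x = best = x0
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        delta = f(y) - f(x)
        # Metropolis criterion: always accept improvements, and accept
        # worsening moves with probability exp(-delta / t).
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = y
            if f(x) < f(best):
                best = x
        t *= alpha  # no-op when alpha == 1 (fixed temperature)
    return best

# Toy problem: minimise f(x) = x**2 over the integers; neighbours are x +/- 1.
f = lambda x: x * x
step = lambda x, rng: x + rng.choice((-1, 1))

fixed = simulated_annealing(f, 40, step, t0=5.0, alpha=1.0)
cooled = simulated_annealing(f, 40, step, t0=5.0, alpha=0.995)
```

On this toy landscape both variants reach the optimum; the differences discussed above only emerge on landscapes whose neighbourhood structure varies across regions.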