Simulated annealing

From Cornell University Computational Optimization Open Textbook - Optimization Wiki

Revision as of 18:27, 12 December 2024

Author: Gwen Zhang (xz929), Yingjie Wang (yw2749), Junchi Xiao (jx422), Yichen Li (yl3938), Xiaoxiao Ge (xg353) (ChemE 6800 Fall 2024)

Stewards: Nathan Preuss, Wei-Han Chen, Tianqi Xiao, Guoqing Hu

== Introduction ==

Simulated annealing (SA) is a probabilistic optimization algorithm inspired by the metallurgical annealing process, which reduces defects in a material by controlling the cooling rate to achieve a stable state.[1] The core concept of SA is to allow algorithms to escape the constraints of local optima by occasionally accepting suboptimal solutions. This characteristic enables SA to find near-global optima in large and complex search spaces.[2] During the convergence process, the probability of accepting a suboptimal solution diminishes over time.

SA is widely applied in diverse fields such as scheduling,[3] machine learning, and engineering design, and is particularly effective for combinatorial optimization problems that are challenging for deterministic methods.[4] First proposed in the 1980s by Kirkpatrick, Gelatt, and Vecchi, SA demonstrated its efficacy in solving various complex optimization problems through its analogy to thermodynamics.[5] Today, it remains a powerful heuristic often combined with other optimization techniques to enhance performance in challenging problem spaces.

== Algorithm Discussion ==

=== Formal Description of the Algorithm ===

SA is a probabilistic technique for finding approximate solutions to optimization problems, particularly those with large search spaces. Inspired by the annealing process in metallurgy, the algorithm explores the solution space by occasionally accepting worse solutions with a probability that diminishes over time. This reduces the risk of getting stuck in local optima and increases the likelihood of discovering the global optimum.

The basic steps of the SA algorithm are as follows:

  1. Initialization: Begin with an initial solution <math>s_0</math> and an initial temperature <math>T_0</math>.
  2. Iterative Improvement: While the stopping criteria are not met:
    • Generate a new solution <math>s_{new}</math> in the neighborhood of the current solution <math>s</math>.
    • Calculate the difference in objective values <math>\Delta E = f(s_{new}) - f(s)</math>, where <math>f</math> is the objective function.
    • If <math>\Delta E < 0</math> (i.e., <math>s_{new}</math> is a better solution), accept <math>s_{new}</math> as the current solution.
    • If <math>\Delta E > 0</math>, accept <math>s_{new}</math> with probability <math>P = \exp\left(-\frac{\Delta E}{T}\right)</math>.
    • Update the temperature <math>T</math> according to a cooling schedule, typically <math>T = \alpha \cdot T</math>, where <math>\alpha</math> is a constant between 0 and 1.
  3. Termination: Stop the process when the temperature falls below a minimum threshold <math>T_{min}</math>, or after a predefined number of iterations. The best solution encountered is returned as the approximate optimum.
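The acceptance rule at the heart of step 2 can be sketched in a few lines of Python. This is a minimal illustration, not part of the original article; the function name `accept` and the sample values of <math>\Delta E</math> and <math>T</math> are chosen for the example:

```python
import math
import random

def accept(delta_E, T):
    """Metropolis acceptance rule: always accept an improving move;
    accept a worsening move with probability exp(-delta_E / T)."""
    if delta_E < 0:
        return True
    return random.random() < math.exp(-delta_E / T)

# The same worsening move (delta_E = 1) is accepted far more often
# while the temperature is high than after the system has cooled.
print(math.exp(-1 / 10.0))  # acceptance probability at T = 10  (about 0.90)
print(math.exp(-1 / 0.1))   # acceptance probability at T = 0.1 (about 4.5e-5)
```

As the printed probabilities show, cooling gradually shifts the search from broad exploration toward pure improvement.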

=== Pseudocode for Simulated Annealing ===

 SimulatedAnnealing(f, s_0, T_0, alpha, T_min):
     s_current = s_0
     s_best = s_0
     T = T_0
     while T > T_min:
         s_new = GenerateNeighbor(s_current)
         delta_E = f(s_new) - f(s_current)
         if delta_E < 0 or Random(0, 1) < exp(-delta_E / T):
             s_current = s_new
             if f(s_current) < f(s_best):
                 s_best = s_current
         T = alpha * T
     return s_best
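The pseudocode translates directly to Python. The following sketch minimizes a toy objective <math>f(x) = x^2</math>; the objective, the uniform-step neighbor function, and all parameter values are illustrative choices, not prescribed by the algorithm:

```python
import math
import random

def simulated_annealing(f, neighbor, s0, T0, alpha, T_min, seed=0):
    """Simulated annealing with a geometric cooling schedule.
    Tracks and returns the best solution encountered."""
    rng = random.Random(seed)
    s_current = s_best = s0
    T = T0
    while T > T_min:
        s_new = neighbor(s_current, rng)
        delta_E = f(s_new) - f(s_current)
        # Accept improvements always; accept worse moves with
        # probability exp(-delta_E / T).
        if delta_E < 0 or rng.random() < math.exp(-delta_E / T):
            s_current = s_new
            if f(s_current) < f(s_best):
                s_best = s_current
        T *= alpha  # cooling schedule: T = alpha * T
    return s_best

# Toy objective with its global minimum at x = 0; the neighbor takes
# a small uniform random step from the current point.
best = simulated_annealing(
    f=lambda x: x * x,
    neighbor=lambda x, rng: x + rng.uniform(-1.0, 1.0),
    s0=10.0, T0=100.0, alpha=0.99, T_min=1e-3,
)
print(best)  # a point near the global minimum at 0
```

Keeping a separate `s_best` matters because the Metropolis rule can wander away from a good solution at high temperature; returning the best solution seen matches the termination step described above.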
== References ==

  1. Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). "Equation of state calculations by fast computing machines." Journal of Chemical Physics, 21(6), 1087-1092.
  2. Kirkpatrick, S., Gelatt, C. D., & Vecchi, M. P. (1983). "Optimization by simulated annealing." Science, 220(4598), 671-680.
  3. Aarts, E. H., & Korst, J. H. (1988). Simulated Annealing and Boltzmann Machines: A Stochastic Approach to Combinatorial Optimization and Neural Computing. Wiley.
  4. Cerny, V. (1985). "Thermodynamical approach to the traveling salesman problem: An efficient simulation algorithm." Journal of Optimization Theory and Applications, 45(1), 41-51.
  5. Kirkpatrick, S. (1984). "Optimization by simulated annealing: Quantitative studies." Journal of Statistical Physics, 34(5-6), 975-986.