Heuristic algorithms


Author: Anmol Singh (as2753)

Steward: Fengqi You, Allen Yang

Introduction

In mathematical programming, a heuristic algorithm is a procedure that determines near-optimal solutions to an optimization problem. It achieves this by trading optimality, completeness, accuracy, or precision for speed. Nevertheless, heuristics are widely used for a variety of reasons:

·      Problems that do not have an exact solution or for which the formulation is unknown

·      Problems whose exact solution is computationally intensive to obtain

·      Calculation of bounds on the optimal solution in branch and bound solution processes

Methodology

Optimization heuristics can be categorized into two broad classes depending on the way the solution domain is organized:

1.    Construction methods (Greedy algorithms)

A greedy algorithm works in phases: at each step it makes the locally optimal choice in the hope of finding an overall optimal solution to the entire problem. It is a technique used to solve the famous “travelling salesman problem”, where the heuristic followed is: "At each step of the journey, visit the nearest unvisited city" (see the code sketch after this list).

2.    Local Search methods

A local search method follows an iterative approach: start with some initial solution, explore the neighborhood of the current solution, and replace the current solution with a better neighboring solution, repeating until no further improvement is found. For the “travelling salesman problem”, a solution is a cycle containing all nodes of the graph and the target is to minimize the total length of the cycle.
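
The Python sketch below illustrates both classes on the travelling salesman problem: a nearest-neighbor construction (greedy) phase followed by a 2-opt local search phase. The distance matrix, the 2-opt neighborhood, and all function names are illustrative assumptions, not taken from the source article.

# Illustrative sketch: greedy construction + 2-opt local search for the
# travelling salesman problem on a symmetric distance matrix (assumed data).

def nearest_neighbor_tour(dist, start=0):
    """Construction heuristic: at each step, visit the nearest unvisited city."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[last][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(dist, tour):
    """Total length of the closed cycle through all cities."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(dist, tour):
    """Local search: repeatedly replace the current cycle with a better
    neighboring cycle obtained by reversing one segment (a 2-opt move)."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour) + 1):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(dist, candidate) < tour_length(dist, tour):
                    tour = candidate
                    improved = True
    return tour

# Example usage with a small made-up distance matrix
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
tour = nearest_neighbor_tour(dist)   # construction (greedy) phase
tour = two_opt(dist, tour)           # local search improvement phase
print(tour, tour_length(dist, tour))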

Applications

Genetic Algorithm

Tabu Search Algorithm

Simulated Annealing Algorithm

The Simulated Annealing Algorithm was developed by Kirkpatrick et al. in 1983 (Kirkpatrick, S., Gelatt, C., & Vecchi, M. (1983). Optimization by Simulated Annealing. Science, 220(4598), 671-680. Retrieved November 25, 2020, from http://www.jstor.org/stable/1690046) and is based on the analogy of ideal crystals in thermodynamics. The annealing process in metallurgy lets particles arrange themselves in the position of minimum potential energy as the temperature is slowly decreased. The Simulated Annealing algorithm mimics this mechanism, using the objective function of an optimization problem in place of the energy of a material to arrive at a solution. A general algorithm is as follows (a Python sketch follows the steps):

1.    Fix the initial temperature (T⁰)

2.    Generate a starting point x⁰ (this is the best point X* at present)

3.    Randomly generate a neighboring point Xˢ

4.    Accept Xˢ as X* (the current best solution) if an acceptance criterion is met. The criterion must be such that the probability of accepting a worse point is greater than zero, particularly at higher temperatures

5.    If an equilibrium condition is satisfied, go to (6); otherwise jump back to (3).

6.    If the termination conditions are not met, decrease the temperature according to a chosen cooling scheme and jump back to (1). If the termination conditions are satisfied, stop the calculation, accepting the current best value X* as the final (‘optimal’) solution. (Brief review of static optimization methods, in: Stanisław Sieniutycz, Jacek Jeżowski, Energy Optimization in Process Systems and Fuel Cells (Third Edition), Elsevier, 2018, pp. 1-41, ISBN 9780081025574, https://doi.org/10.1016/B978-0-08-102557-4.00001-3.)
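
Below is a minimal Python sketch of this outline. The objective function, the perturbation-based neighbor generation, the Metropolis acceptance rule, the fixed number of moves per temperature (the equilibrium condition), and the geometric cooling scheme T ← αT are illustrative assumptions rather than choices prescribed by the source.

import math
import random

def simulated_annealing(f, x0, T0=1.0, alpha=0.9, moves_per_temp=50, T_min=1e-3):
    """Minimize f starting from x0, following the outline above."""
    x_curr = list(x0)                      # step 2: starting point is the current best
    x_best, f_best = x_curr, f(x_curr)
    f_curr = f_best
    T = T0                                 # step 1: initial temperature
    while T > T_min:                       # step 6: termination condition
        for _ in range(moves_per_temp):    # step 5: equilibrium condition (fixed move count)
            # step 3: random neighboring point (small perturbation, assumed)
            x_new = [xi + random.uniform(-0.1, 0.1) for xi in x_curr]
            f_new = f(x_new)
            delta = f_new - f_curr
            # step 4: acceptance criterion; worse points can be accepted,
            # with higher probability at higher temperatures
            if delta < 0 or random.random() < math.exp(-delta / T):
                x_curr, f_curr = x_new, f_new
                if f_curr < f_best:
                    x_best, f_best = x_curr, f_curr
        T *= alpha                         # step 6: geometric cooling scheme (assumed)
    return x_best, f_best

# Example usage: minimize a simple quadratic with minimum at (1, -2)
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2
print(simulated_annealing(f, x0=[5.0, 5.0]))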

Particle Swarm Optimization

Example: The Knapsack Problem

References

·       Eiselt, Horst A., et al. Integer Programming and Network Models. Springer, 2011.

·       Moore, Karleigh, et al. "Greedy Algorithms | Brilliant Math & Science Wiki." Brilliant.org, 2020, https://brilliant.org/wiki/greedy-algorithm/.