Heuristic algorithms

From Cornell University Computational Optimization Open Textbook - Optimization Wiki

Author: Anmol Singh (as2753)

Steward: Fengqi You, Allen Yang

Introduction

In mathematical programming, a heuristic algorithm is a procedure that determines near-optimal solutions to an optimization problem. It achieves this by trading optimality, completeness, accuracy, or precision for speed. Nevertheless, heuristics are widely used for a variety of reasons:

·      Problems that have no known exact solution method or whose formulation is unknown

·      Problems whose exact solution is too computationally intensive to obtain

·      Calculation of bounds on the optimal solution in branch-and-bound solution processes

Methodology

Optimization heuristics can be categorized into two broad classes depending on the way the solution domain is organized:

1.    Construction methods (Greedy algorithms)

A greedy algorithm works in phases: at each step it makes the locally optimal choice in the hope of arriving at an overall optimal solution to the entire problem. It is used, for example, on the famous “travelling salesman problem”, where the heuristic followed is: "At each step of the journey, visit the nearest unvisited city."

2.    Local Search methods

A local search method follows an iterative approach: starting from some initial solution, it explores the neighborhood of the current solution and replaces the current solution with a better neighboring solution. For this method, the “travelling salesman problem” would use the heuristic in which a solution is a cycle containing all nodes of the graph and the target is to minimize the total length of the cycle. A sketch of both methods applied to this problem is given after this list.
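The following is a minimal sketch of both ideas for the travelling salesman problem, assuming cities are given as 2-D coordinates with Euclidean distances; the random data, function names, and the segment-reversal (2-opt-style) move are illustrative choices, not part of the cited sources:

```python
import math
import random

def dist(a, b):
    """Euclidean distance between two cities given as (x, y) pairs."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(tour, cities):
    """Total length of a closed tour (returns to the starting city)."""
    return sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbour(cities):
    """Construction heuristic: always visit the nearest unvisited city."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(cities[last], cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def local_search(tour, cities, iterations=1000):
    """Local search: repeatedly reverse a random segment of the cycle
    and keep the change whenever it shortens the total length."""
    best = tour[:]
    best_len = tour_length(best, cities)
    for _ in range(iterations):
        i, j = sorted(random.sample(range(len(best)), 2))
        candidate = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
        cand_len = tour_length(candidate, cities)
        if cand_len < best_len:
            best, best_len = candidate, cand_len
    return best

cities = [(random.random(), random.random()) for _ in range(20)]
greedy_tour = nearest_neighbour(cities)           # construction method
improved_tour = local_search(greedy_tour, cities)  # local search method
print(tour_length(greedy_tour, cities), tour_length(improved_tour, cities))
```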

Popular Heuristic Algorithms

Genetic Algorithm

The term Genetic Algorithm was first used by John Holland (J.H. Holland (1975), Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, Michigan; re-issued by MIT Press, 1992). Genetic algorithms are designed to mimic the Darwinian theory of evolution, which states that populations of species evolve over time to produce organisms that are more complex and fitter for survival. They operate on string structures, analogous to biological chromosomes, which evolve over time according to the rule of survival of the fittest by using a randomized yet structured information exchange. Thus, in every generation, a new set of strings is created using parts of the fittest members of the old set. (Optimal design of heat exchanger networks, in: Wilfried Roetzel, Xing Luo, Dezhen Chen (eds.), Design and Operation of Heat Exchangers and their Networks, Academic Press, 2020, pp. 231-317, ISBN 9780128178942, https://doi.org/10.1016/B978-0-12-817894-2.00006-6.) The algorithm terminates when a satisfactory fitness level has been reached for the population or the maximum number of generations has been reached. The typical steps are:

1.     Choose an initial population of candidate solutions

2.     Calculate the fitness of each individual, i.e., how good the solution is

3.     Perform crossover on the population: randomly choose pairs of individuals as parents and exchange some parts between the parents to generate new individuals

4.     Perform mutation: randomly change some individuals to create other new individuals

5.     Evaluate the fitness of the offspring

6.     Select the surviving individuals

7.    Proceed from step 3 if the termination criteria have not been reached; a short code sketch of this loop follows the citation below

(Wang FS., Chen LH. (2013) Genetic Algorithms. In: Dubitzky W., Wolkenhauer O., Cho KH., Yokota H. (eds) Encyclopedia of Systems Biology. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-9863-7_412)
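As a rough illustration of these steps (not the formulation used by any of the cited works), the sketch below evolves bit strings to maximize a toy fitness function; the population size, mutation rate, one-point crossover, and the "count the ones" objective are arbitrary illustrative choices:

```python
import random

TARGET_LEN = 20          # length of each bit string (illustrative choice)
POP_SIZE = 30            # population size (illustrative choice)
MUTATION_RATE = 0.02     # per-bit mutation probability (illustrative choice)
GENERATIONS = 100        # maximum number of generations

def fitness(individual):
    """Fitness = number of ones in the bit string (the 'OneMax' toy problem)."""
    return sum(individual)

def crossover(parent_a, parent_b):
    """One-point crossover: exchange the tails of the two parents."""
    point = random.randint(1, TARGET_LEN - 1)
    return parent_a[:point] + parent_b[point:], parent_b[:point] + parent_a[point:]

def mutate(individual):
    """Flip each bit with a small probability."""
    return [1 - bit if random.random() < MUTATION_RATE else bit for bit in individual]

# Step 1: initial population of random candidate solutions
population = [[random.randint(0, 1) for _ in range(TARGET_LEN)] for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    # Steps 3-4: crossover and mutation produce offspring
    offspring = []
    while len(offspring) < POP_SIZE:
        parent_a, parent_b = random.sample(population, 2)
        for child in crossover(parent_a, parent_b):
            offspring.append(mutate(child))
    # Steps 2, 5, 6: evaluate fitness and keep the fittest individuals
    population = sorted(population + offspring, key=fitness, reverse=True)[:POP_SIZE]
    # Step 7: terminate early once a satisfactory fitness level is reached
    if fitness(population[0]) == TARGET_LEN:
        break

print("best individual:", population[0], "fitness:", fitness(population[0]))
```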

Tabu Search Algorithm

Tabu search (TS) is a heuristic algorithm created by Fred Glover (Fred Glover (1986). "Future Paths for Integer Programming and Links to Artificial Intelligence". Computers and Operations Research. 13 (5): 533–549. doi:10.1016/0305-0548(86)90048-1). It combines a descent-style search with memory techniques that avoid cycling while determining an optimal solution. It does so by forbidding or penalizing moves that take the solution, in the next iteration, to points in the solution space previously visited. The algorithm uses some memory to keep a tabu list of forbidden moves, which are the moves of the previous iterations or moves that might be considered unwanted. A general algorithm is as follows:

1.     Select an initial solution s0 ∈ S. Initialize the tabu list L0 = ∅ and select a tabu list size. Set k = 0.

2.     Determine the neighborhood of feasible solutions N(sk), excluding those in the tabu list Lk.

3.     Select the next move sk+1 from N(sk) (or from Lk if it yields a better solution) and update Lk+1.

4.     Stop if a termination condition is reached; otherwise, set k = k + 1 and return to step 2. A short code sketch of this procedure follows the citation below.

(Optimization of Preventive Maintenance Program for Imaging Equipment in Hospitals, Editor(s): Zdravko Kravanja, Miloš Bogataj, Computer Aided Chemical Engineering, Elsevier, Volume 38, 2016, Pages 1833-1838, ISSN 1570-7946, ISBN 9780444634283, https://doi.org/10.1016/B978-0-444-63428-3.50310-6.)
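The sketch below is a minimal illustration of this scheme, assuming the search space is the set of bit strings of fixed length, the neighborhood of a solution is all single bit flips, and the tabu list stores recently flipped positions; these modelling choices, the toy objective, and the aspiration rule are illustrative assumptions rather than details from the cited works:

```python
import random
from collections import deque

N_BITS = 15
TABU_SIZE = 5          # tabu list size (illustrative choice)
MAX_ITER = 200

def objective(x):
    """Toy objective to maximize: reward alternating bits."""
    return sum(1 for i in range(len(x) - 1) if x[i] != x[i + 1])

# Step 1: initial solution and empty tabu list
current = [random.randint(0, 1) for _ in range(N_BITS)]
best, best_val = current[:], objective(current)
tabu = deque(maxlen=TABU_SIZE)       # stores recently flipped bit positions

for k in range(MAX_ITER):
    # Step 2: neighborhood = all single bit flips whose position is not tabu
    candidates = []
    for i in range(N_BITS):
        neighbor = current[:]
        neighbor[i] = 1 - neighbor[i]
        # Aspiration: allow a tabu move if it beats the best solution found so far
        if i not in tabu or objective(neighbor) > best_val:
            candidates.append((objective(neighbor), i, neighbor))
    # Step 3: move to the best admissible neighbor and update the tabu list
    val, i, current = max(candidates)
    tabu.append(i)
    if val > best_val:
        best, best_val = current[:], val
    # Step 4: simple termination condition (the maximum attainable value)
    if best_val == N_BITS - 1:
        break

print("best value:", best_val, "solution:", best)
```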

Illustrative Example: The Classical Vehicle Routing Problem

(Glover, Fred, and Gary A Kochenberger. Handbook Of Metaheuristics. Kluwer Academic Publishers, 2003.)

Vehicle routing problems have very important applications in distribution management and have become some of the most studied problems in the combinatorial optimization literature. The solution approaches proposed for them include several Tabu Search implementations that currently rank among the most effective. The Classical Vehicle Routing Problem (CVRP) is the basic variant in that class of problems. It can formally be defined as follows. Let G = (V, A) be a graph where V is the vertex set and A is the arc set. One of the vertices represents the depot at which a fleet of identical vehicles of capacity Q is based, and the other vertices represent customers that need to be serviced. With each customer vertex vi are associated a demand qi and a service time ti. With each arc (vi, vj) of A are associated a cost cij and a travel time tij. The CVRP consists of finding a set of routes such that:

1.     Each route begins and ends at the depot

2.     Each customer is visited exactly once by exactly one route

3.     The total demand of the customers assigned to each route does not exceed Q

4.     The total duration of each route (including travel and service times) does not exceed a specified value L

5.     The total cost of the routes is minimized

A feasible solution for the problem thus consists in a partition of the customers into m groups, each of total demand no larger than Q, that are sequenced to yield routes (starting and ending at the depot) of duration no larger than L.
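As a small illustration of these feasibility conditions (not a solution method from the cited handbook), the sketch below checks whether a given partition of customers into sequenced routes respects the capacity Q and the duration limit L, and evaluates the total cost; the data structures, the depot label 0, and the dictionary-based cost and travel-time matrices are illustrative assumptions:

```python
# Illustrative CVRP feasibility and cost check.
# customers: dict mapping customer id -> (demand q_i, service time t_i)
# travel:    dict mapping (i, j) -> travel time t_ij, with 0 denoting the depot
# cost:      dict mapping (i, j) -> arc cost c_ij

def route_feasible(route, customers, travel, Q, L):
    """A route is a sequence of customer ids; it implicitly starts and ends at the depot (0)."""
    demand = sum(customers[c][0] for c in route)
    if demand > Q:
        return False
    stops = [0] + list(route) + [0]
    duration = sum(travel[(stops[i], stops[i + 1])] for i in range(len(stops) - 1))
    duration += sum(customers[c][1] for c in route)   # add service times
    return duration <= L

def solution_feasible(routes, customers, travel, Q, L):
    """Every customer is visited exactly once and every route respects capacity and duration."""
    visited = [c for route in routes for c in route]
    if sorted(visited) != sorted(customers):
        return False
    return all(route_feasible(r, customers, travel, Q, L) for r in routes)

def solution_cost(routes, cost):
    """Total cost of all routes (the quantity to be minimized)."""
    total = 0
    for route in routes:
        stops = [0] + list(route) + [0]
        total += sum(cost[(stops[i], stops[i + 1])] for i in range(len(stops) - 1))
    return total
```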

Simulated Annealing Algorithm

The Simulated Annealing Algorithm was developed by Kirkpatrick et al. in 1983 (Kirkpatrick, S., Gelatt, C., & Vecchi, M. (1983). Optimization by Simulated Annealing. Science, 220(4598), 671-680. Retrieved November 25, 2020, from http://www.jstor.org/stable/1690046) and is based on an analogy with ideal crystals in thermodynamics. The annealing process in metallurgy lets particles arrange themselves in the position of minimal potential energy as the temperature is slowly decreased. The Simulated Annealing algorithm mimics this mechanism, using the objective function of an optimization problem in place of the energy of a material, to arrive at a solution. A general algorithm is as follows:

1.    Fix initial temperature (T0)

2.    Generate starting point x0 (this is the best point X* at present)

3.    Randomly generate a point XS (a neighboring point)

4.    Accept XS as X* (the current best solution) if an acceptance criterion is met. The criterion must be such that the probability of accepting a worse point is greater than zero, particularly at higher temperatures

5.    If an equilibrium condition is satisfied, go to (6), otherwise jump back to (3).

6.    If termination conditions are not met, decrease the temperature according to a chosen cooling scheme and jump back to (3). If termination conditions are satisfied, stop the calculations, accepting the current best value X* as the final (‘optimal’) solution. (Brief review of static optimization methods, in: Stanisław Sieniutycz, Jacek Jeżowski, Energy Optimization in Process Systems and Fuel Cells (Third Edition), Elsevier, 2018, pp. 1-41, ISBN 9780081025574, https://doi.org/10.1016/B978-0-08-102557-4.00001-3.) A short code sketch of this scheme follows.
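The following is a minimal sketch of this scheme for minimizing a one-dimensional toy function, assuming a geometric cooling schedule and the standard Metropolis acceptance rule; the objective, step size, inner-loop budget, and cooling rate are illustrative choices, not details from the cited source:

```python
import math
import random

def objective(x):
    """Toy objective to minimize (plays the role of the material's energy)."""
    return x * x + 10 * math.sin(x)

T = 10.0              # step 1: initial temperature T0 (illustrative choice)
cooling_rate = 0.95   # geometric cooling scheme (illustrative choice)
x_best = x_current = random.uniform(-10, 10)   # step 2: starting point

while T > 1e-3:                       # step 6: terminate once the temperature is very low
    for _ in range(100):              # step 5: inner loop until an 'equilibrium' budget is spent
        # step 3: generate a neighboring point
        x_new = x_current + random.uniform(-1.0, 1.0)
        delta = objective(x_new) - objective(x_current)
        # step 4: always accept better points; accept worse points with probability exp(-delta/T)
        if delta < 0 or random.random() < math.exp(-delta / T):
            x_current = x_new
            if objective(x_current) < objective(x_best):
                x_best = x_current
    # step 6: decrease the temperature and continue
    T *= cooling_rate

print("approximate minimizer:", x_best, "value:", objective(x_best))
```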

Particle Swarm Optimization

Example: The Knapsack Problem
