Evolutionary multimodal optimization

From Cornell University Computational Optimization Open Textbook - Optimization Wiki

Author: Connor Clappin (cjc395) (ChemE 6800 Fall 2024)

Stewards: Nathan Preuss, Wei-Han Chen, Tianqi Xiao, Guoqing Hu

Introduction

Evolutionary Multimodal Optimization (EMO) involves the use of evolutionary algorithms (EAs) to locate and maintain multiple optimal solutions in problems with multiple optima. Traditional optimization approaches typically focus on identifying a single global optimum, often ignoring other potential solutions. EMO addresses this limitation by employing strategies to preserve diversity in the population, allowing the identification of multiple global and local optima.

Multimodal optimization problems arise in many practical domains. For example, in engineering design, alternative configurations might satisfy the same performance criteria but differ in material, cost, or other constraints. Similarly, in drug discovery, different compounds may achieve the desired therapeutic effect while varying side effects or production costs. EAs, inspired by the principles of natural selection, are well-suited to such challenges due to their stochastic nature and ability to explore complex, high-dimensional search spaces.[1]

The primary motivation for studying EMO is its ability to provide comprehensive insights into a problem’s landscape. Identifying multiple solutions offers flexibility in decision-making, enabling stakeholders to choose solutions based on secondary criteria not included in the optimization. Moreover, EMO enhances robustness by offering alternatives that might perform well under varying or uncertain conditions.[2]

Algorithm Discussion

Evolutionary algorithms are population-based optimization methods that inherently maintain a diverse set of solutions. However, standard EAs tend to converge to a single solution due to selection pressure. In multimodal optimization, preserving diversity is crucial to avoid premature convergence and to explore multiple peaks in the fitness landscape.

Diversity-Preserving Mechanisms

To prevent premature convergence and maintain diversity across the population, several strategies are employed. Below, these strategies are detailed with their respective equations.

Fitness Sharing

Fitness sharing reduces the fitness of individuals based on their proximity to others, thereby discouraging overcrowding in a single region of the search space. The adjusted fitness of an individual $i$ is calculated as:

$$f'_i = \frac{f_i}{\sum_{j=1}^{N} \operatorname{sh}(d_{ij})}$$

where:

  • $f_i$: original fitness of individual $i$,
  • $N$: total number of individuals,
  • $d_{ij}$: distance between individual $i$ and individual $j$,
  • $\operatorname{sh}(d_{ij})$: sharing function:

$$\operatorname{sh}(d) = \begin{cases} 1 - \left(\dfrac{d}{\sigma_{\text{share}}}\right)^{\alpha} & \text{if } d < \sigma_{\text{share}}, \\ 0 & \text{otherwise,} \end{cases}$$

where $\sigma_{\text{share}}$ is the niche radius and $\alpha$ is a constant controlling the shape of the sharing function. This ensures individuals in densely populated regions receive lower adjusted fitness values.[3]
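A minimal Python sketch of this mechanism is given below (NumPy assumed); the default niche radius and $\alpha$ are illustrative choices rather than values prescribed by the method.

```python
import numpy as np

def shared_fitness(population, fitness, sigma_share=0.5, alpha=1.0):
    """Adjust fitness values with fitness sharing.

    population : (N, n) array of real-valued genomes
    fitness    : (N,) array of raw fitness values (higher is better)
    """
    # Pairwise Euclidean distances d_ij between all individuals.
    diffs = population[:, None, :] - population[None, :, :]
    d = np.linalg.norm(diffs, axis=-1)

    # Sharing function: sh(d) = 1 - (d / sigma)^alpha inside the niche, 0 outside.
    sh = np.where(d < sigma_share, 1.0 - (d / sigma_share) ** alpha, 0.0)

    # Niche count m_i = sum_j sh(d_ij); each individual is at distance 0 from
    # itself, so m_i >= 1 and the division below is always well defined.
    niche_count = sh.sum(axis=1)
    return fitness / niche_count
```

Because individuals packed into one basin divide that basin's payoff among themselves, a lone individual on a smaller peak can still compete during selection.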

Crowding and Deterministic Crowding

Crowding methods replace similar individuals, ensuring diverse genetic representation in the population.

In deterministic crowding:

1. Each offspring $c$ is compared to its most similar parent $p$, based on a distance measure $d(c, p)$, typically defined in the genotype or phenotype space.

2. The better-performing individual between $c$ and $p$ survives:

$$\text{survivor} = \begin{cases} c & \text{if } f(c) > f(p), \\ p & \text{otherwise.} \end{cases}$$

Probabilistic crowding adds randomness to survivor selection. The probability of an offspring $c$ surviving over its parent $p$ is:

$$P(c) = \frac{f(c)}{f(c) + f(p)}.$$ [4]
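Both replacement rules can be sketched as below. For simplicity, this version pairs each offspring with the most similar member of the current population rather than strictly with its own two parents, and it assumes positive fitness values to be maximized; both are illustrative assumptions.

```python
import numpy as np

def crowding_replacement(parents, offspring, fitness, rng=None, probabilistic=False):
    """Pair each offspring with its most similar individual and keep the survivor.

    parents, offspring : (N, n) arrays of real-valued genomes
    fitness            : callable returning a positive scalar to maximize
    """
    rng = rng or np.random.default_rng()
    survivors = parents.copy()
    for c in offspring:
        # Most similar current individual under Euclidean distance.
        j = np.argmin(np.linalg.norm(survivors - c, axis=1))
        f_c, f_p = fitness(c), fitness(survivors[j])
        if probabilistic:
            # Probabilistic crowding: child wins with probability f(c) / (f(c) + f(p)).
            wins = rng.random() < f_c / (f_c + f_p)
        else:
            # Deterministic crowding: the better of the two survives.
            wins = f_c > f_p
        if wins:
            survivors[j] = c
    return survivors
```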

Niching and Speciation

Niching divides the population into subgroups focusing on different regions of the search space, promoting independent exploration of peaks.

Speciation groups individuals based on genetic similarity. Individuals $i$ and $j$ belong to the same species if:

$$d_{ij} < \sigma_{s}$$

where $\sigma_{s}$ is the speciation threshold. Genetic similarity can be measured with the Euclidean distance:

$$d_{ij} = \sqrt{\sum_{k=1}^{n} \left(x_{ik} - x_{jk}\right)^{2}}$$

where $x_{ik}$ and $x_{jk}$ are the $k$-th gene values of individuals $i$ and $j$, and $n$ is the number of genes.[5]
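One simple way to realize this rule is a greedy, seed-based grouping, sketched below; the Euclidean distance and the default threshold are illustrative assumptions.

```python
import numpy as np

def assign_species(population, sigma_species=1.0):
    """Greedily group individuals into species by distance to a species seed."""
    seeds, labels = [], np.empty(len(population), dtype=int)
    for i, x in enumerate(population):
        for s, seed in enumerate(seeds):
            if np.linalg.norm(x - seed) < sigma_species:
                labels[i] = s
                break
        else:
            # No existing seed is close enough: x founds a new species.
            seeds.append(x)
            labels[i] = len(seeds) - 1
    return labels
```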

Island Models

Island models split the population into separate subpopulations (islands) that evolve independently. Migration periodically allows individuals to move between islands.

1. Migration occurs every $M$ generations.

2. Migrants are selected based on their fitness, with the probability of selection:

$$P(i) = \frac{f_i}{\sum_{j=1}^{N_{\text{island}}} f_j}$$

where $f_i$ is the fitness of individual $i$ on the island and $N_{\text{island}}$ is the local island population size.

3. Migrants replace the least-fit individuals in the target island, i.e. those with the smallest fitness values.[6]
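A sketch of one migration event is given below. The ring topology (each island sends migrants to its right-hand neighbour), the number of migrants, and the assumption of positive fitness values to be maximized are illustrative choices.

```python
import numpy as np

def migrate(islands, fitnesses, n_migrants=2, rng=None):
    """One migration event between islands arranged in a ring.

    islands   : list of (N_k, n) arrays, one subpopulation per island
    fitnesses : list of (N_k,) arrays of positive fitness values (higher is better)
    """
    rng = rng or np.random.default_rng()
    k = len(islands)
    for src in range(k):
        dst = (src + 1) % k  # ring topology: island src sends to island src + 1
        # Select emigrants fitness-proportionally: P(i) = f_i / sum_j f_j.
        p = fitnesses[src] / fitnesses[src].sum()
        emigrants = rng.choice(len(islands[src]), size=n_migrants, replace=False, p=p)
        # Replace the least-fit individuals on the destination island.
        worst = np.argsort(fitnesses[dst])[:n_migrants]
        islands[dst][worst] = islands[src][emigrants]
        # (A full implementation would re-evaluate the replaced individuals here.)
    return islands
```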

Algorithm Workflow

The workflow of an EMO algorithm involves the following key steps:

  1. Initialization: Random generation of an initial population.
  2. Evaluation: Each individual’s fitness is assessed.
  3. Diversity Preservation: Application of mechanisms like fitness sharing, crowding, or island models.
  4. Selection: Individuals are selected based on adjusted fitness for breeding. A commonly used method is roulette-wheel selection, with

$$P(i) = \frac{f'_i}{\sum_{j=1}^{N} f'_j}$$

where $f'_i$ is the adjusted fitness of individual $i$.

  5. Genetic Operations: Crossover and mutation are applied to generate offspring. Mutation, for example, may alter a gene with:

$$x'_k = x_k + \delta$$

where $\delta$ is a random perturbation (e.g., drawn from a Gaussian distribution).

  6. Replacement: The population is updated with new individuals.
  7. Termination: The algorithm stops when a condition, such as a maximum number of generations, is reached.

A Python sketch tying steps 1–7 together is given after this list.
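The sketch below is a minimal, self-contained loop for a generic fitness function to be maximized; the niche radius, mutation scale, and blend crossover are illustrative choices, and the sharing and selection formulas follow the expressions given above.

```python
import numpy as np

def evolve(objective, bounds, pop_size=60, n_gen=100,
           sigma_share=0.5, mut_sigma=0.1, rng=None):
    """Minimal EMO loop: init -> evaluate -> share -> select -> vary -> replace."""
    rng = rng or np.random.default_rng()
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)

    # 1. Initialization: uniform random population inside the box bounds.
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))

    for _ in range(n_gen):
        # 2. Evaluation (objective returns a non-negative fitness to maximize;
        #    transform the raw objective beforehand if necessary).
        fit = np.array([objective(x) for x in pop])

        # 3. Diversity preservation: fitness sharing f'_i = f_i / sum_j sh(d_ij).
        d = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=-1)
        sh = np.where(d < sigma_share, 1.0 - d / sigma_share, 0.0)
        shared = fit / sh.sum(axis=1)

        # 4. Selection: roulette wheel on shared fitness, P(i) = f'_i / sum_j f'_j.
        p = shared / shared.sum()
        parents = pop[rng.choice(pop_size, size=pop_size, p=p)]

        # 5. Genetic operations: blend crossover and Gaussian mutation x' = x + delta.
        beta = rng.random((pop_size, 1))
        children = beta * parents + (1 - beta) * parents[::-1]
        children = np.clip(children + rng.normal(0.0, mut_sigma, children.shape), lo, hi)

        # 6. Replacement: generational (offspring replace the whole population).
        pop = children

    # 7. Termination after n_gen generations.
    return pop
```

For a minimization problem, the raw objective is first transformed into a fitness to be maximized, for example via $1/(1 + f)$.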

Parameter Tuning and Assumptions

The effectiveness of EMO depends on proper parameter settings:

  • Niche radius $\sigma_{\text{share}}$: Determines the sensitivity of fitness sharing and speciation.
  • Migration rate and interval $M$: Balance exploration and exploitation in island models.

Assumptions:

  1. The problem landscape contains multiple optima.
  2. The algorithm can maintain sufficient diversity to explore the landscape effectively.

This structured explanation, with accompanying equations, highlights the mathematical foundation of evolutionary multimodal optimization mechanisms.

Numerical Example

Problem Statement

We aim to solve a multimodal optimization problem using the Rastrigin function, a common benchmark in optimization.[7] The function is defined as:

$$f(\mathbf{x}) = 10n + \sum_{i=1}^{n} \left[ x_i^2 - 10\cos(2\pi x_i) \right]$$

- Domain: $x_i \in [-5.12,\ 5.12]$
- Global minimum: $f(\mathbf{x}^*) = 0$ at $\mathbf{x}^* = (0, \ldots, 0)$
- Number of variables: kept small (for simplicity)

The function has multiple local minima, making it ideal for demonstrating EMO.

Figure 1: Rastrigin Function Plot
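For reference, the benchmark takes only a few lines of Python (NumPy assumed); the two evaluation points shown are illustrative.

```python
import numpy as np

def rastrigin(x):
    """Rastrigin function: f(x) = 10n + sum(x_i^2 - 10*cos(2*pi*x_i))."""
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

print(rastrigin([0.0, 0.0]))   # 0.0 -> the global minimum at the origin
print(rastrigin([1.0, 1.0]))   # 2.0 -> close to a shallow local minimum near (1, 1)
```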

Steps in the EMO Process

Step 1: Initialize Population

We randomly generate an initial population of individuals uniformly within the domain $[-5.12,\ 5.12]$ in each coordinate.

Initial Population:

Figure 2: Rastrigin Function Initial Population

Step 2: Evaluate Fitness

The fitness of each individual is calculated using the Rastrigin function:

Each individual's fitness is obtained by evaluating the Rastrigin function at its coordinates. Because this is a minimization problem, lower values are better: an individual at the origin attains the global minimum $f = 0$, while individuals far from any minimum receive large values.

Step 3: Apply Niching (Fitness Sharing)

To maintain diversity, we apply fitness sharing, which adjusts fitness values based on proximity to other individuals.

1. Sharing function: $\operatorname{sh}(d_{ij}) = 1 - \left(d_{ij}/\sigma_{\text{share}}\right)^{\alpha}$ if $d_{ij} < \sigma_{\text{share}}$, and $0$ otherwise.

2. Adjusted fitness: $f'_i = f_i / \sum_{j} \operatorname{sh}(d_{ij})$.

Compute the pairwise distances between individuals and adjust each fitness accordingly: individuals in crowded regions accumulate a large niche count and therefore a weaker adjusted fitness. (For a minimization problem such as this one, the raw objective is first converted into a fitness to be maximized, e.g. $f_i = 1/(1 + \text{Rastrigin}(\mathbf{x}_i))$, before sharing is applied.)

Figure 3: Rastrigin Function Niching Process

Step 4: Selection

Using the shared fitness values, select individuals for reproduction. Common methods include tournament selection and roulette-wheel selection; a minimal sketch of both is given below.
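In the sketch, because the Rastrigin example is a minimization problem, the roulette version works on the transformed weight $1/(1 + f)$, which is an illustrative choice rather than part of the method.

```python
import numpy as np

rng = np.random.default_rng(0)

def tournament_select(pop, obj_values, k=3):
    """Pick the best of k random contestants (lower objective value wins)."""
    contestants = rng.choice(len(pop), size=k, replace=False)
    return pop[contestants[np.argmin(obj_values[contestants])]]

def roulette_select(pop, obj_values):
    """Fitness-proportionate selection on 1/(1 + f), so small f gets high weight."""
    weights = 1.0 / (1.0 + obj_values)
    return pop[rng.choice(len(pop), p=weights / weights.sum())]
```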

Step 5: Variation (Crossover and Mutation)

1. Crossover: Combine two parent solutions, for example with an arithmetic (blend) crossover that sets each gene of the child to $x^{\text{child}}_k = \beta\, x^{(1)}_k + (1 - \beta)\, x^{(2)}_k$ for some $\beta \in [0, 1]$.

2. Mutation: Introduce random variation, e.g. $x'_k = x_k + \delta$ with $\delta \sim \mathcal{N}(0, \sigma^2)$, clipping the result back into the domain $[-5.12,\ 5.12]$ if necessary.
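A sketch of the two operators on real-valued genomes follows; the blend coefficients, mutation scale, and example parent vectors are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def blend_crossover(parent1, parent2):
    """Child is a random convex combination of the two parents (per gene)."""
    beta = rng.random(parent1.shape)
    return beta * parent1 + (1.0 - beta) * parent2

def gaussian_mutation(child, sigma=0.3, low=-5.12, high=5.12):
    """Add a small Gaussian perturbation and keep the genome inside the domain."""
    return np.clip(child + rng.normal(0.0, sigma, size=child.shape), low, high)

child = blend_crossover(np.array([1.8, -0.4]), np.array([0.2, 0.9]))
mutated = gaussian_mutation(child)
```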

Step 6: Replace and Iterate

Replace the worst-performing individuals with offspring and repeat steps 2–5 for several generations.

Figure 4: Rastrigin Function Evolutionary Convergence

Final Results

After several generations, the algorithm identifies multiple optima: typically one niche converges on the global minimum at the origin while others settle into distinct local minima. These represent different valleys of the Rastrigin function.

Summary

Evolutionary multimodal optimization discovers diverse solutions by maintaining population diversity through niching and by exploring the multimodal landscape with evolutionary operators. This process enables the identification of both global and local optima.

Applications

Evolutionary multimodal optimization has a wide range of applications across various domains. In engineering design, EMO is used to identify multiple feasible configurations of mechanical components. For instance, an aircraft wing might be designed with different materials or structural layouts that all meet aerodynamic requirements. EMO allows engineers to evaluate trade-offs between cost, weight, and manufacturability.[8]

In bioinformatics, EMO is applied to problems such as DNA sequence alignment. Multiple high-scoring alignments may have biological significance, and EMO helps discover these alternatives. By maintaining diversity, the algorithm ensures that rare but promising alignments are not overlooked.[9]

Data clustering is another prominent application. In this context, EMO is used to optimize the placement of cluster centers in unsupervised learning. By exploring multiple solutions, the algorithm identifies distinct groupings in datasets, even when the number of clusters is not predefined. [6]

Case studies highlight the practical utility of EMO. For example, in antenna design, EMO has been used to create antennas with multiple operating frequencies. By optimizing the design parameters, the algorithm identifies configurations that maximize efficiency across different frequency bands. In chemical engineering, EMO has been employed to optimize reaction conditions for processes with multiple operating points, balancing yield, cost, and safety.

Software tools like MATLAB, Python’s DEAP library, and GAlib support the implementation of EMO. MATLAB’s Global Optimization Toolbox includes built-in functions for niching and fitness sharing, while DEAP provides a flexible framework for developing custom evolutionary algorithms.
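As an illustration of the DEAP route, the sketch below assembles a basic genetic algorithm for the two-variable Rastrigin function from DEAP's standard operators; it uses plain tournament selection, so an explicit niching step (sharing, crowding, or clearing) of the kind described above would still need to be added for genuinely multimodal behaviour.

```python
import random
import numpy as np
from deap import algorithms, base, creator, tools

def rastrigin(ind):
    x = np.asarray(ind, dtype=float)
    # DEAP expects fitness values as a tuple.
    return (10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)),)

creator.create("FitnessMin", base.Fitness, weights=(-1.0,))   # minimize
creator.create("Individual", list, fitness=creator.FitnessMin)

toolbox = base.Toolbox()
toolbox.register("attr", random.uniform, -5.12, 5.12)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr, n=2)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", rastrigin)
toolbox.register("mate", tools.cxBlend, alpha=0.5)
toolbox.register("mutate", tools.mutGaussian, mu=0.0, sigma=0.3, indpb=0.2)
toolbox.register("select", tools.selTournament, tournsize=3)

pop = toolbox.population(n=60)
pop, _ = algorithms.eaSimple(pop, toolbox, cxpb=0.7, mutpb=0.2, ngen=50, verbose=False)
```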

Conclusion

Evolutionary Multimodal Optimization extends traditional evolutionary algorithms to address problems requiring multiple optimal solutions. By incorporating techniques such as fitness sharing, crowding, and niching, EMO maintains diversity within the population, preventing premature convergence and enabling the exploration of multiple optima.

The key takeaways from this discussion include the importance of diversity preservation, the effectiveness of EMO in solving complex problems, and its versatility across domains. Applications in engineering, bioinformatics, and data science demonstrate its practical value, while advances in computational tools facilitate its adoption.

Future research in EMO may focus on adaptive niching parameters to enhance performance and scalability to handle high-dimensional optimization problems. Hybrid approaches combining EMO with other methods, such as local search, hold promise for further improving efficiency and solution quality.

References

  1. Deb, K., & Goldberg, D. E. (1989). An investigation of niche and species formation in genetic function optimization. In Proceedings of the 3rd International Conference on Genetic Algorithms (pp. 42–50). San Mateo, CA: Morgan Kaufmann.
  2. Deb, K. (2001). Multi-objective optimization using evolutionary algorithms. Wiley.
  3. Goldberg, D. E., & Richardson, J. (1987). Genetic algorithms with sharing for multimodal function optimization. In Proceedings of the Second International Conference on Genetic Algorithms (pp. 41–49). Hillsdale, NJ: L. Erlbaum Associates.
  4. Hansen, N., Müller, S. D., & Koumoutsakos, P. (2003). Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation, 11(1), 1–18.
  5. Deb, K., & Saha, A. (2009). Multimodal optimization using a bi-objective evolutionary algorithm. Evolutionary Computation, 20(1), 27–62.
  6. Lin, H. T., & Lee, C. Y. (2011). Evolutionary multi-objective clustering and its applications. Expert Systems with Applications, 38(8), 9674–9683.
  7. Rastrigin, L. A. (1974). Systems of extremal control. Cybernetics and Systems Analysis, 12(4), 630–635.
  8. Coello Coello, C. A., & Lamont, G. B. (2004). Applications of multi-objective evolutionary algorithms. In Applications of Multi-Objective Evolutionary Algorithms (pp. 437–523). Springer.
  9. Wang, G. G., Deb, K., & Tan, Y. (2019). A survey on multi-modal optimization using evolutionary algorithms. Swarm and Evolutionary Computation, 44, 102–117.