Genetic algorithm
Author: Yunchen Huo (yh2244), Ran Yi (ry357), Yanni Xie (yx682), Changlin Huang (ch2269), Jingyao Tong (jt887) (ChemE 6800 Fall 2024)
Stewards: Nathan Preuss, Wei-Han Chen, Tianqi Xiao, Guoqing Hu
Introduction
The genetic algorithm (GA) was first introduced by John H. Holland[1] in 1973. It is an optimization technique based on Charles Darwin's theory of evolution by natural selection.
Algorithm Discussion

Before diving into the algorithm, here are definitions of the basic terminology:
- Gene: The smallest unit that makes up the chromosome (decision variable)
- Chromosome: A group of genes, where each chromosome represents a solution (potential solution)
- Population: A group of chromosomes (a group of potential solutions)
GA involves the following seven steps (a minimal code sketch follows the list):
- Initialization
- Randomly generate the initial population of a predetermined population size
- Evaluation
- Evaluate the fitness of every chromosome in the population to see how good it is. Higher fitness implies a better solution, making the chromosome more likely to be selected as a parent of the next generation
- Selection
- Natural selection serves as the main inspiration of GA, where chromosomes are randomly selected from the entire population for mating, and chromosomes with higher fitness values are more likely to be selected [2].
- Crossover
- The purpose of crossover is to create superior offspring (better solutions) by combining parts from each selected parent chromosome. There are different types of crossover, such as single-point and double-point crossover [2]. In single-point crossover, the parent chromosomes are swapped before and after a single point. In double-point crossover, the parent chromosomes are swapped between two points [2].
- Mutation
- A mutation operator is applied to make random changes to the genes of the children's chromosomes, which maintains the diversity of chromosomes in the population and enables GA to find better solutions [2].
- Insertion
- Insert the mutated children chromosomes back into the population
- Repeat steps 2–6 until a stopping criterion is met
- Maximum number of generations reached
- No significant improvement from newer generations
- Expected fitness value is achieved
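A minimal sketch of these seven steps in Python, assuming a toy maximization problem (maximize the number of 1-genes in a fixed-length bit string); the population size, mutation rate, and single-point crossover used here are illustrative choices, not requirements of the algorithm.

```python
import random

GENES_PER_CHROMOSOME = 20   # length of each bit-string solution
POPULATION_SIZE = 30
GENERATIONS = 100
MUTATION_RATE = 0.01

def fitness(chromosome):
    # Toy objective: count the 1-genes (the "OneMax" problem).
    return sum(chromosome)

def select_parent(population):
    # Fitness-proportionate (roulette-wheel) selection: fitter chromosomes
    # are more likely, but not guaranteed, to be chosen.
    total = sum(fitness(c) for c in population)
    pick = random.uniform(0, total)
    running = 0
    for chromosome in population:
        running += fitness(chromosome)
        if running >= pick:
            return chromosome
    return population[-1]

def crossover(parent1, parent2):
    # Single-point crossover: swap the tails of the two parents.
    point = random.randint(1, GENES_PER_CHROMOSOME - 1)
    return parent1[:point] + parent2[point:]

def mutate(chromosome):
    # Flip each gene with a small probability to preserve diversity.
    return [1 - g if random.random() < MUTATION_RATE else g
            for g in chromosome]

# 1. Initialization: a random population of bit strings.
population = [[random.randint(0, 1) for _ in range(GENES_PER_CHROMOSOME)]
              for _ in range(POPULATION_SIZE)]

for generation in range(GENERATIONS):
    # 2-6. Evaluation (via fitness), selection, crossover, mutation, and
    # insertion of the children build the next generation.
    population = [mutate(crossover(select_parent(population),
                                   select_parent(population)))
                  for _ in range(POPULATION_SIZE)]
    best = max(population, key=fitness)
    # 7. Stopping criterion: expected fitness value achieved.
    if fitness(best) == GENES_PER_CHROMOSOME:
        break

print("Best chromosome:", best, "fitness:", fitness(best))
```

Roulette-wheel selection here implements the fitness-proportionate idea described in the Selection step; tournament or rank selection are common alternatives.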
Applications
GA is one of the most important and successful algorithms in optimization, as demonstrated by its numerous applications in machine learning, engineering, finance, and other domains. The following subsections introduce applications of GA in unsupervised regression, virtual power plants, and forecasting financial market indices.
Unsupervised Regression
Unsupervised regression is a promising dimensionality reduction method[3]. The concept is to map from a low-dimensional latent space into the high-dimensional data space using a regression model[4]. The goal of unsupervised regression is to minimize the data space reconstruction error[3]. Common regression models used in this setting are nearest neighbor regression and kernel regression[3]. Since the mapping from the low-dimensional space into the high-dimensional space is usually complex, the data space reconstruction error may be a non-convex, multimodal function with multiple local optima. In this case, GA can escape local optima thanks to its population-based search, selection, crossover, and mutation.
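As an illustration of why a population-based search helps on such non-convex, multimodal error surfaces, the sketch below minimizes the Rastrigin function, a standard multimodal test function standing in for a data space reconstruction error; the real-valued encoding, tournament selection, and operator settings are illustrative assumptions, not the specific unsupervised regression formulation of [3].

```python
import math
import random

DIMENSIONS = 5        # stand-in for the low-dimensional latent coordinates
POP_SIZE = 50
GENERATIONS = 200

def rastrigin(x):
    # Multimodal test function with many local minima; stands in for a
    # non-convex data space reconstruction error (lower is better).
    return 10 * len(x) + sum(xi**2 - 10 * math.cos(2 * math.pi * xi) for xi in x)

def tournament(population, k=3):
    # Tournament selection: the candidate with the lowest error wins.
    return min(random.sample(population, k), key=rastrigin)

def crossover(p1, p2):
    # Uniform crossover on real-valued genes.
    return [a if random.random() < 0.5 else b for a, b in zip(p1, p2)]

def mutate(x, rate=0.1, scale=0.3):
    # Gaussian perturbation keeps diversity and lets the search leave local minima.
    return [xi + random.gauss(0, scale) if random.random() < rate else xi for xi in x]

population = [[random.uniform(-5.12, 5.12) for _ in range(DIMENSIONS)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = min(population, key=rastrigin)
print("Best reconstruction-error value:", round(rastrigin(best), 4))
```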
Virtual Power Plants
Renewable energy resources, such as wind and solar power, have a distinctly fluctuating character[3]. To mitigate the effect of such fluctuations on the power grid, engineers have introduced the concept of virtual power plants, which bundle various power sources into a single unit that meets specific properties[3].
The optimization goal is to minimize the absolute value of power in the virtual power plant system with a rule base[3]. This rule base is also known as a learning classifier system; it allows energy storage devices and standby power plants in the system to respond flexibly in different system states and achieve a balance between power consumption and generation[3].
Since the rule base is complex and highly uncertain, GA can be used to evolve the action part of the rule base effectively. For example, in the complex energy scheduling process, GA can optimize the charge/discharge strategy of the energy storage equipment and the start/stop plan of standby power plants to ensure the balance of power consumption and generation.
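A minimal sketch of this idea, assuming a hypothetical hourly net-power profile and a single storage device with a fixed charge/discharge power; the data, the device model, and the GA settings are illustrative assumptions rather than the rule-base (learning classifier system) formulation described above.

```python
import random

# Hypothetical hourly net power (generation minus consumption, in MW).
NET_POWER = [3, 2, -1, -4, -2, 0, 1, 5, 4, 2, -3, -5,
             -4, -2, 1, 3, 4, 2, 0, -1, -3, -2, 1, 2]
STORAGE_POWER = 2            # MW absorbed or released per hour by the storage device
POP_SIZE, GENERATIONS, MUTATION_RATE = 40, 150, 0.05

def imbalance(schedule):
    # Objective: minimize the total absolute power imbalance after the
    # storage device charges (+1), idles (0), or discharges (-1) each hour.
    return sum(abs(p - a * STORAGE_POWER) for p, a in zip(NET_POWER, schedule))

def tournament(pop, k=3):
    # Tournament selection: the schedule with the lowest imbalance wins.
    return min(random.sample(pop, k), key=imbalance)

def crossover(p1, p2):
    # Single-point crossover on the hourly action schedule.
    point = random.randint(1, len(NET_POWER) - 1)
    return p1[:point] + p2[point:]

def mutate(schedule):
    # Randomly reassign a few hourly actions to preserve diversity.
    return [random.choice([-1, 0, 1]) if random.random() < MUTATION_RATE else a
            for a in schedule]

population = [[random.choice([-1, 0, 1]) for _ in NET_POWER] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = min(population, key=imbalance)
print("Residual imbalance:", imbalance(best), "schedule:", best)
```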
Forecasting Financial Market Indices
A financial market index consists of a weighted average of the prices of the individual shares that make up the market[5]. In financial markets, many traders and analysts believe that stock prices move in trends and repeat price patterns[5]. Under this premise, using Grammatical Evolution (GE) to forecast financial market indices and enhance trading decisions is a good choice.
GE is a machine learning method based on GA[5]. GE uses a biologically inspired genotype-to-phenotype mapping process and can evolve computer programs in an arbitrary language[5]. Unlike GA, which encodes the solution directly within the genetic material, GE uses a many-to-one mapping process, which improves robustness[5].
To use GE to forecast financial market indices, one first imports processed historical stock price data. GE then learns price patterns in the data and generates models that can predict future price movements. These models can help traders and analysts identify market trends, such as upward and downward trends.
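A small sketch of GE's genotype-to-phenotype mapping, assuming a toy expression grammar over lagged price variables; the grammar, the codon interpretation (codon value modulo the number of rules, as is standard in GE), and the variable names are illustrative, not the specific setup of [5].

```python
import random

# Toy grammar: each non-terminal maps to its list of production rules.
GRAMMAR = {
    "<expr>": ["( <expr> <op> <expr> )", "<var>"],
    "<op>":   ["+", "-", "*"],
    "<var>":  ["price_lag1", "price_lag2", "volume"],
}

def map_genotype(genotype, symbol="<expr>", max_depth=8):
    # Standard GE mapping: each codon selects a rule via value % rule_count,
    # so many different genotypes can map to the same phenotype (many-to-one).
    codons = list(genotype)
    def expand(sym, depth):
        if sym not in GRAMMAR:
            return sym
        if depth >= max_depth or not codons:
            # Fall back to a terminal-producing rule when depth or codons run out.
            rules = [r for r in GRAMMAR[sym] if "<expr>" not in r] or GRAMMAR[sym]
            rule = rules[0]
        else:
            rule = GRAMMAR[sym][codons.pop(0) % len(GRAMMAR[sym])]
        return " ".join(expand(tok, depth + 1) for tok in rule.split())
    return expand(symbol, 0)

genotype = [random.randint(0, 255) for _ in range(20)]
model = map_genotype(genotype)
print("Evolved forecasting expression:", model)
# In a full GE run, this expression would be evaluated on historical price
# data, its forecasting error used as the fitness, and the genotype evolved
# with the usual GA selection, crossover, and mutation operators.
```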
Software tools and platforms that utilize Genetic Algorithms
MATLAB: The Global Optimization Toolbox in MATLAB includes a GA solver and is widely used for engineering simulations and machine learning.
Python: The DEAP and PyGAD libraries in Python provide an environment for research and AI model optimization.
OpenGA: OpenGA is a free, open-source C++ GA library that provides developers with a flexible way to implement GA for solving a variety of optimization problems[6].
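As a brief illustration of the PyGAD interface, the sketch below fits three weights to a toy target value; it assumes PyGAD 3.x, in which the fitness function also receives the GA instance as its first argument, and the problem itself is purely illustrative.

```python
import numpy as np
import pygad

# Toy problem: find weights w so that w . inputs is close to a desired output.
inputs = np.array([4.0, -2.0, 3.5])
desired_output = 44.0

def fitness_func(ga_instance, solution, solution_idx):
    # PyGAD maximizes fitness, so reward small absolute error.
    output = np.sum(solution * inputs)
    return 1.0 / (abs(output - desired_output) + 1e-6)

ga_instance = pygad.GA(
    num_generations=100,
    num_parents_mating=4,
    fitness_func=fitness_func,
    sol_per_pop=20,
    num_genes=len(inputs),
    mutation_percent_genes=20,
)
ga_instance.run()

solution, solution_fitness, _ = ga_instance.best_solution()
print("Best weights:", solution, "fitness:", solution_fitness)
```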
References
1. Holland, J. H. (1973). Genetic algorithms and the optimal allocation of trials. SIAM Journal on Computing, 2(2), 88–105.
2. Mirjalili, S. (2018). Genetic Algorithm. In Evolutionary Algorithms and Neural Networks. Springer, pp. 43–56.