Genetic Algorithm
Evolutionary Computation

Genetic Algorithm Optimizer

Select a benchmark function and configure population size, mutation rate, and crossover rate. Watch the population converge to the global optimum.

What is a Genetic Algorithm?

🧑‍🎓
What exactly is a Genetic Algorithm? It sounds like biology, not engineering.
🎓
Basically, it's a search algorithm inspired by natural evolution. You start with a random "population" of candidate solutions. Then, you repeatedly select the best ones, "mate" them to create new solutions, and apply random "mutations." Over generations, the population evolves toward an optimal solution. Try running the simulator above with the default settings to see a population spread out and then converge.
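As a concrete illustration of that loop, here is a minimal sketch in Python (not the simulator's actual code). It minimizes f(x) = x² with truncation selection, averaging crossover, and Gaussian mutation; the constants and operator choices are illustrative simplifications.

```python
import random

# Illustrative settings, not the simulator's defaults.
POP_SIZE = 20
N_GENERATIONS = 50
MUTATION_RATE = 0.1

def fitness(x):
    # Lower f(x) = x^2 is better, so negate for a "higher is fitter" score.
    return -x * x

def evolve():
    # Start with a random population of candidate solutions.
    population = [random.uniform(-10, 10) for _ in range(POP_SIZE)]
    for _ in range(N_GENERATIONS):
        # Selection: keep the fitter half as parents.
        parents = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
        # Crossover: each child averages two randomly chosen parents.
        children = []
        while len(children) < POP_SIZE:
            a, b = random.sample(parents, 2)
            child = (a + b) / 2
            # Mutation: occasional Gaussian perturbation.
            if random.random() < MUTATION_RATE:
                child += random.gauss(0, 1)
            children.append(child)
        population = children
    return max(population, key=fitness)

best = evolve()
print(best)  # converges toward the optimum x = 0
```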
🧑‍🎓
Wait, really? So the sliders for "Population Size" and "Mutation Rate" control this artificial evolution? What happens if I set the mutation rate to zero?
🎓
Great question! If you set mutation to zero, you lose a key source of new genetic material. The algorithm might get stuck on a local optimum—a good but not the best solution—because it can't explore new areas of the "fitness landscape" you see in the plot. For instance, on the Rastrigin function, which has many local minima, a low mutation rate might cause premature convergence to a suboptimal point. Try it and watch the population stop improving.
🧑‍🎓
That makes sense. So what's the "Crossover Rate" for? And how do I know which benchmark function to test on, like Sphere vs. Rosenbrock?
🎓
Crossover is the "mating" process, where you combine parts of two parent solutions to create a child. The rate controls how often this happens versus simply copying parents unchanged. A common choice is a high crossover rate (e.g., 0.8) to aggressively combine good traits. As for the functions, each stresses the GA differently. The Sphere function is smooth and simple, good for observing basic convergence. The Rosenbrock function has a long, curved, nearly flat valley, testing the algorithm's ability to navigate tricky terrain. Switch between them and see how the search behavior changes!
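For reference, the benchmark surfaces mentioned in this conversation can be written as plain two-variable Python functions; this is a sketch, and the simulator's own scaling and domains may differ.

```python
import math

def sphere(x, y):
    # Smooth bowl; single global minimum of 0 at (0, 0).
    return x * x + y * y

def rosenbrock(x, y, a=1.0, b=100.0):
    # Long, curved, nearly flat valley; global minimum of 0 at (a, a^2).
    return (a - x) ** 2 + b * (y - x * x) ** 2

def rastrigin(x, y, A=10.0):
    # Highly multimodal: a grid of local minima surrounding the
    # global minimum of 0 at (0, 0).
    return (2 * A
            + (x * x - A * math.cos(2 * math.pi * x))
            + (y * y - A * math.cos(2 * math.pi * y)))
```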

Physical Model & Key Equations

The core of a Genetic Algorithm is the selection process, which determines which solutions survive to reproduce. A common method is "Fitness-Proportionate Selection," where the probability of selecting an individual is proportional to its fitness score relative to the whole population.

$$P(\text{select individual }i) = \frac{f_i}{\sum_{j=1}^{N} f_j}$$

Here, $f_i$ is the fitness of individual $i$, and $N$ is the population size. Higher fitness means a better solution; for a minimization problem, the raw function value must first be transformed (e.g., negated or subtracted from the worst value in the population) so that lower cost maps to higher fitness. This selection pressure drives the population toward better regions of the search space.
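A minimal sketch of this selection rule in Python, assuming all fitness values are positive as the formula requires:

```python
import random

def roulette_select(population, fitnesses):
    """Fitness-proportionate ("roulette-wheel") selection:
    P(select i) = f_i / sum_j f_j, with all f_j > 0."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f
        if r <= cumulative:
            return individual
    return population[-1]  # guard against floating-point round-off
```

Each call draws one parent, and fitter individuals are drawn proportionally more often. For minimization, convert costs into positive fitness scores first (e.g., worst cost minus each individual's cost).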

Another key operation is mutation, which introduces random variations to maintain diversity and explore new solutions. For a real-valued parameter $x$, a simple mutation adds Gaussian noise.

$$x' = x + \mathcal{N}(0, \sigma)$$

Here, $x'$ is the mutated parameter, and $\mathcal{N}(0, \sigma)$ is a random number from a normal distribution with mean 0 and standard deviation $\sigma$, which is often related to the mutation rate you control in the simulator. This small random jump can help escape local optima.
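In code, this per-gene Gaussian mutation might look like the following sketch; the `rate` and `sigma` parameters are illustrative, and the simulator's mutation-rate slider may map to them differently.

```python
import random

def mutate(genome, rate=0.1, sigma=0.5):
    """With probability `rate`, perturb each gene by Gaussian
    noise N(0, sigma); otherwise copy it unchanged."""
    return [g + random.gauss(0, sigma) if random.random() < rate else g
            for g in genome]
```

A larger `sigma` produces bigger random jumps, which helps escape local optima at the cost of slower fine-tuning near the optimum.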

Real-World Applications

Aerodynamic Design: Genetic Algorithms are used to optimize the shape of aircraft wings or turbine blades. Engineers define parameters describing the shape, and the GA searches for the configuration that minimizes drag or maximizes lift, evaluating thousands of designs computationally.

Antenna Design: Designing antennas for specific radiation patterns is highly complex. GAs can optimize the layout and dimensions of antenna elements to meet target frequency and gain specifications, a task difficult for traditional methods.

Robotics & Motion Planning: GAs help optimize the gait of walking robots or the trajectory of robotic arms. The algorithm searches for joint angle sequences that minimize energy consumption or complete a task fastest while avoiding obstacles.

Financial Portfolio Optimization: In finance, GAs can search for the optimal mix of assets (stocks, bonds) that maximizes expected return for a given level of risk, navigating a complex landscape of historical data and constraints.

Common Misconceptions and Points to Note

As you become familiar with genetic algorithms (GAs) using this simulator, you might be tempted to see them as a "universal optimization tool," but they have clear limits. First, GAs do not guarantee finding the optimal solution; they are a method for finding a very good solution with high probability. In practical applications especially, the computational cost can become enormous. For example, optimizing a vehicle's crash safety might require several hours for a single CAE analysis. With a population of 100 evolved for 50 generations, a simple calculation gives 5,000 analyses, which is often impractical. In practice, GAs are therefore frequently used for rough exploration to determine an initial design, or to generate starting solutions for problems where other methods provide no leads.

Next, avoid relying on magical parameter settings. Discard superstitions like "a mutation rate of 0.01 is the golden rule." The optimal parameters vary greatly depending on the problem. What you should try with this tool is observing how an extremely high mutation rate (e.g., 0.5) causes the population to perform a random walk and fail to converge, while an excessively low rate (e.g., 0.001) causes a loss of initial diversity, trapping the search in the first local optimum it encounters. In practice, you'll need to adjust parameters, starting with default values, then increasing the mutation rate if diversity is lost too quickly, or strengthening the selection pressure if convergence is too slow.

Related Engineering Fields

The concept of "multimodal function optimization" behind this simulator connects directly to many advanced engineering fields beyond CAE. For instance, in aircraft wing shape optimization, GAs are used to balance conflicting objectives such as maximizing lift while minimizing drag (multi-objective optimization). If the numerous control points on the wing surface are the design variables, the flow field they create forms a complex response surface, much like the Rastrigin function.

Furthermore, in automotive lightweight design, GAs are applied to minimize weight by treating beam cross-sectional shapes and plate thicknesses of the body frame as variables, with stiffness and natural frequencies as constraints. The "terrain" here is a high-dimensional hypersurface woven from the results of multiple CAE simulations representing crash safety and vibration characteristics. Moreover, in the field of materials informatics, the combinations of molecular compositions or processing conditions for new materials are treated as genes, and GAs are used to search for materials with desired properties. Thus, any engineering design problem where the relationship between design variables and performance metrics is a "black box" and many local optima exist is a potential application for the principles you learn with this tool.

For Further Learning

Once you're comfortable with this simulator, the next step is to experience the "curse of dimensionality." This tool uses two variables (x and y), but real-world problems can involve tens to hundreds of variables. As variables increase, the search space expands exponentially, overwhelming a simple GA. As an exercise, try to mentally visualize a landscape with three variables. Furthermore, learning about constrained optimization serves as a bridge to practical application. An example constraint is "minimize weight, but deflection must stay under 5 mm." Handling such constraints requires techniques like penalty function methods or fitness-ranking approaches.
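The penalty-function idea can be sketched as follows; `weight` and `deflection` are hypothetical stand-ins for real CAE evaluations of a single design variable x.

```python
def weight(x):
    # Hypothetical model: a thinner design (smaller x) weighs less.
    return x

def deflection(x):
    # Hypothetical model: a thinner design deflects more.
    return 10.0 / x

def penalized_objective(x, limit=5.0, penalty=1000.0):
    """Turn "minimize weight subject to deflection <= limit" into an
    unconstrained objective by adding a large cost for violations."""
    violation = max(0.0, deflection(x) - limit)
    return weight(x) + penalty * violation ** 2
```

A GA minimizing `penalized_objective` does not forbid infeasible designs outright; it makes them expensive, so selection steers the population back toward the feasible region.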

Regarding mathematical background, understanding the Schema Theorem deepens your grasp of the algorithm's essence. This theoretical pillar of GAs states that "genetic patterns (schemata) with short defining lengths and above-average fitness increase exponentially over generations." This explains how GAs perform a constructive search by combining good gene blocks, rather than a purely random search. For your next topic, I recommend comparing related population-based methods such as Evolution Strategies (ES) and Particle Swarm Optimization (PSO). Like GAs, both maintain a population of candidate solutions, but PSO uses a model where particles fly toward their own best position and the swarm's best position, exhibiting different search dynamics than a GA. Understanding these differences helps you develop judgment in selecting the appropriate optimization method for a given problem.
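To make the contrast concrete, here is a minimal one-dimensional PSO sketch of that velocity update; the coefficient values are common textbook choices, not authoritative settings.

```python
import random

def pso_minimize(f, n_particles=30, n_iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Each particle's velocity blends its inertia (w), a pull toward
    its personal best (c1), and a pull toward the swarm's best (c2)."""
    lo, hi = bounds
    xs = [random.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                 # each particle's best-known position
    gbest = min(xs, key=f)        # swarm's best-known position
    for _ in range(n_iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest

best = pso_minimize(lambda x: (x - 2.0) ** 2)
print(best)  # converges toward the minimum at x = 2
```

Note the absence of crossover and mutation: PSO's search is driven entirely by these attraction terms, which is one reason its dynamics differ from a GA's.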