Original article was published by Rakib Ansari on Deep Learning on Medium
A simple approach to understanding genetic algorithms, with examples, in a fun way.
Before diving into genetic algorithms, let's first understand 'what is an optimization technique?'
Optimization Technique :
Optimization techniques are methods used to discover the best solution out of all the possible solutions, subject to the constraints of the problem.
Now let's understand the 'Genetic Algorithm' :
The genetic algorithm is one such optimization algorithm, built on the natural evolutionary process.
- They are commonly used to generate high-quality solutions to optimization and search problems.
- Genetic algorithms simulate the process of natural selection: the species that can adapt to changes in their environment survive, reproduce, and pass their traits on to the next generation.
- The genetic algorithm is based on the genetic structure and behavior of the chromosomes in a population.
Foundation of genetic algorithm :
- Each chromosome represents a possible solution; the population is thus a collection of chromosomes.
- Each individual in the population is characterized by a fitness function. The greater the fitness, the better the solution.
- From the individuals available in the population, the fittest are selected to reproduce the next generation of offspring.
- The offspring carry features of both parents and may also be affected by mutation. A mutation is a small change in the gene structure.
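To make these ideas tangible, here is a minimal sketch in Python, assuming a toy "one-max" problem where a chromosome is a bit string and fitness simply counts the 1s. The constants and function names here are illustrative assumptions, not from the article:

```python
import random

# Toy setup (illustrative assumption): a chromosome is a bit string,
# and fitness counts the 1s, so more 1s means a better solution.
CHROMOSOME_LENGTH = 8   # genes (parameters) per chromosome
POPULATION_SIZE = 6     # chromosomes in the population

def random_chromosome():
    # Each gene is one binary parameter of a candidate solution.
    return [random.randint(0, 1) for _ in range(CHROMOSOME_LENGTH)]

def fitness(chromosome):
    # The greater the fitness, the better the solution.
    return sum(chromosome)

population = [random_chromosome() for _ in range(POPULATION_SIZE)]
best = max(population, key=fitness)   # the fittest individual so far
```

The representation is the key design choice: every later step (selection, crossover, mutation) operates on this gene layout, which is why the order of genes on the chromosome matters.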
Life Cycle Of Genetic Algorithm :
1. Initialization of Population :
Every gene represents a parameter (variable) in the solution. The collection of parameters that forms a solution is the chromosome, and the population is a collection of chromosomes. The order of genes on the chromosome matters.
2. Fitness Function :
Out of the available chromosomes, we have to select the best ones to reproduce offspring, so each chromosome is assigned a fitness value. The fitness score helps select the individuals that will be used for reproduction.
3. Selection :
The main goal of this phase is to focus the search on the regions where the chances of finding the best solution are highest.
4. Reproduction :
Offspring are generated in two ways:
- Crossover : a random point is selected while mating a pair of parents, and their genes are exchanged around it to generate offspring.
- Mutation : random changes applied to genes, which maintain diversity in the population and prevent premature convergence.
5. Convergence (when to stop) :
- When there is no further improvement in the solution.
- When a fixed number of generations or a time limit is reached.
- When an acceptable solution is obtained.
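The whole life cycle above can be sketched in a few lines of Python. This is a toy illustration on the hypothetical "one-max" problem (maximize the number of 1s in a bit string); the parameter values and the particular selection, crossover, and mutation operators are simple illustrative choices, not prescriptions:

```python
import random

GENES, POP, GENERATIONS, MUTATION_RATE = 12, 20, 100, 0.05

def fitness(chromosome):
    # Toy objective ("one-max"): more 1s means a fitter chromosome.
    return sum(chromosome)

def select(population):
    # Tournament selection: the fitter of two random individuals wins.
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover at a random position.
    point = random.randint(1, GENES - 1)
    return p1[:point] + p2[point:]

def mutate(chromosome):
    # Flip each gene with a small probability to maintain diversity.
    return [1 - g if random.random() < MUTATION_RATE else g for g in chromosome]

# 1. Initialization: a random population of bit-string chromosomes.
population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for generation in range(GENERATIONS):
    best = max(population, key=fitness)          # 2. fitness evaluation
    if fitness(best) == GENES:                   # 5. convergence check
        break
    # 3. selection and 4. reproduction build the next generation.
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]
```

Each pass through the loop is one generation; the stopping test here is the "acceptable solution" criterion, with the generation cap as a fallback.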
So, the genetic algorithm works as follows :
1. The algorithm begins by creating a random initial population.
2. The algorithm then creates a sequence of new populations. At each step, the algorithm uses the individuals in the current generation to create the next population. To create the new population, the algorithm performs the following steps:
- Scores each member of the current population by computing its fitness value. These values are called the raw fitness scores.
- Scales the raw fitness scores to convert them into a more usable range of values. These scaled values are called expectation values.
- Selects members, called parents, based on their expectation.
- Some of the individuals in the current population that have the highest fitness are chosen as elite. These elite individuals are passed directly to the next population.
- Produces children from the parents. Children are produced either by making random changes to a single parent — mutation — or by combining the vector entries of a pair of parents — crossover.
- Replaces the current population with the children to form the next generation.
3. The algorithm stops when one of the stopping criteria is met.
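The scoring, scaling, elitism, and reproduction steps above can be sketched as a single generation update. This is a hedged sketch for bit-string individuals under assumed operators; `scale_to_expectations` and `next_generation` are hypothetical names, not from any library:

```python
import random

def scale_to_expectations(raw_scores):
    # Scale raw fitness scores into selection probabilities ("expectation values").
    total = sum(raw_scores)
    if total == 0:
        return [1.0 / len(raw_scores)] * len(raw_scores)
    return [s / total for s in raw_scores]

def next_generation(population, fitness, elite_count=2, mutation_rate=0.05):
    raw = [fitness(ind) for ind in population]        # raw fitness scores
    expectations = scale_to_expectations(raw)
    # Elitism: the fittest individuals pass to the next population unchanged.
    elite = sorted(population, key=fitness, reverse=True)[:elite_count]
    children = list(elite)
    while len(children) < len(population):
        # Parents are drawn with probability proportional to their expectation.
        p1, p2 = random.choices(population, weights=expectations, k=2)
        point = random.randint(1, len(p1) - 1)        # crossover
        child = p1[:point] + p2[point:]
        child = [1 - g if random.random() < mutation_rate else g
                 for g in child]                      # mutation
        children.append(child)
    return children    # replaces the current population

pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(8)]
pop = next_generation(pop, fitness=sum)
```

Note the design choice: scaling raw scores into expectations keeps selection pressure proportional to fitness, while the elite slots guarantee the best solutions found so far are never lost to an unlucky crossover or mutation.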
To make our understanding concrete, let's look at a more realistic example :
Open the link below and spend 30 seconds just watching it :
In this application, we have to build a simple car model to traverse a terrain. Put some number of wheels on it somewhere, add a set of triangles as a chassis, and off you go. The farther the car goes, the better it is, and the goal is to design the best car you possibly can.
First, the algorithm tries some random solutions, and, since it has no idea about the concept of a car or gravity, it creates a lot of bad solutions that don't work at all. After a while, however, it creates something that is at least remotely similar to a car, which immediately performs much better than the other solutions in the population. The genetic algorithm then creates a new set of solutions, but now not randomly: it respects a rule we call survival of the fittest. The best existing solutions are taken and mixed together to breed new solutions that are also expected to do well. As in natural evolution, mutations can also happen, meaning random changes are applied to the "DNA" of a solution. We know from nature that evolution works extraordinarily well, and the longer we run this genetic optimization program, the better the solutions get.
Applications of Genetic Algorithms :
1. Traveling and Shipment Routing − The traveling salesman problem is one of the major applications of genetic algorithms. For example, when a trip planner is asked to plan a trip, a genetic algorithm can help reduce not only the overall cost of the trip but also the time. GAs are also used to plan the delivery of products from place to place in the most efficient way.
2. Neural Networks − GAs are also used to train neural networks, particularly recurrent neural networks.
3. Image Processing − GAs are used for various digital image processing (DIP) tasks as well like dense pixel matching.
4. Robot Trajectory Generation − GAs have been used to plan the path which a robot arm takes by moving from one point to another.
5. Vehicle routing problems − With multiple soft time windows, multiple depots and a heterogeneous fleet.
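As a small illustration of the first application, here is a hedged sketch of a genetic algorithm for a toy traveling salesman instance. The city coordinates and parameters are made-up assumptions; order crossover is used so that children remain valid permutations of the cities:

```python
import math
import random

# Hypothetical TSP instance: coordinates of six cities.
CITIES = [(0, 0), (1, 5), (2, 3), (5, 2), (6, 6), (8, 1)]

def tour_length(tour):
    # Total length of the closed tour visiting every city once.
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2):
    # Copy a slice from one parent, then fill the gaps in the other
    # parent's order, so the child is still a valid permutation.
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def swap_mutation(tour, rate=0.2):
    # Occasionally swap two cities; the tour stays a valid permutation.
    tour = tour[:]
    if random.random() < rate:
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

population = [random.sample(range(len(CITIES)), len(CITIES)) for _ in range(30)]
for _ in range(200):
    population.sort(key=tour_length)          # shorter tour = fitter
    parents = population[:10]                 # truncation selection
    population = parents + [swap_mutation(order_crossover(*random.sample(parents, 2)))
                            for _ in range(20)]
best = min(population, key=tour_length)
```

For routing problems the crossover operator matters: naive single-point crossover would duplicate or drop cities, which is why permutation-preserving operators like order crossover are used.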
Limitations of Genetic Algorithms :
1. Genetic algorithms do not scale well with complexity.
2. A “better” solution is only better in comparison to other solutions. As a result, the stopping criterion is not clear in every problem.
3. Operating on dynamic data sets is difficult, as genomes begin to converge early on towards solutions which may no longer be valid for later data.
4. GAs cannot effectively solve problems in which the only fitness measure is a single right/wrong measure (like decision problems), as there is no way to converge on the solution.
5. GAs have a tendency to converge towards local optima or even arbitrary points rather than the global optimum of the problem. This means that they do not “know how” to sacrifice short-term fitness to gain longer-term fitness.
To summarize: optimization techniques are methods used to discover the best solution out of all possible solutions under the given constraints, and the genetic algorithm is one such technique, built on the natural evolutionary process. It uses the ideas of natural selection and genetic inheritance. Unlike many other algorithms, it performs a guided random search: it starts from a random initial population and then concentrates the search on the regions that have scored best so far (the guided direction). It is well suited to huge and complex search spaces.