Original article was published on Artificial Intelligence on Medium
NeuroEvolution of Augmenting Topologies (NEAT)
NEAT stands for NeuroEvolution of Augmenting Topologies, which is a genetic algorithm designed to efficiently evolve artificial neural network topologies. It’s an awesome technique that addressed some challenges of Topology and Weight Evolving Artificial Neural Networks (TWEANN). Before we go deeper into NEAT, let’s take a look at two key concepts: Artificial Neural Network (ANN) and Genetic Algorithm (GA).
What is an artificial neural network?
An artificial neural network (ANN) is a brain-inspired computing system that consists of interconnected nodes and weighted connections. A typical ANN architecture contains an input layer, some hidden layers, and an output layer. Each node in the hidden layers and the output layer applies an activation function to the weighted sum of its input values to produce an output. In most cases, an ANN learns by adjusting its connection weights through backpropagation, a technique that uses gradient descent to minimize a loss function.
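To make the structure concrete, here is a minimal sketch of a single forward pass through such a network, assuming one hidden layer, a sigmoid activation, and hand-picked weights (all names and numbers below are illustrative, not from any particular library):

```python
import math

def sigmoid(x):
    # Activation function: squashes the weighted sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    # Each hidden node applies the activation function to the
    # weighted sum of the input values.
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in w_hidden]
    # The output node does the same over the hidden activations.
    return sigmoid(sum(w * h for w, h in zip(w_output, hidden)))

# 2 inputs -> 2 hidden nodes -> 1 output, with hand-picked weights.
y = forward([1.0, 0.5], w_hidden=[[0.4, -0.6], [0.3, 0.8]], w_output=[1.2, -0.7])
print(0.0 < y < 1.0)  # a sigmoid output always stays in (0, 1)
```

Backpropagation would adjust `w_hidden` and `w_output` from the loss gradient; NEAT, as we will see, evolves them instead.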
If you would like to learn more about ANN, here are some resources that may be helpful:
What is a genetic algorithm?
“We see beautiful adaptation everywhere and in every part of the organic world.” — Charles Darwin, On the Origin of Species.
In 1859, the theory of natural selection introduced by Charles Darwin laid the first stone of evolutionary biology. The essence of this theory is that those most adaptable to change are the most likely to survive. Since then, the notion of “Survival of the Fittest” has provided a new way for people to understand species evolution.
If biological evolution is able to generate amazing species like human beings and many others, is it possible to combine modern genetics with machine learning techniques to solve a problem? Genetic algorithms open the door of opportunity and shine the way to possibilities.
By definition, a genetic algorithm is a heuristic technique that simulates the process of natural selection to solve a vast variety of optimization problems, especially those with a nondifferentiable or stochastic objective function.
Here are some of the key terms:
- Fitness function: In many cases, a fitness function is the same as an objective function. It is a function that takes the solution as input and generates a fitness score as output to evaluate the quality of each solution. It is an essential element of a GA. We should tailor the fitness function based on each specific problem.
- Population: In GA, the population is the subset of all the candidate solutions to a given problem.
- Chromosome: Each candidate solution in the population is a chromosome, sometimes referred to as a genome.
- Gene: Each element position within a chromosome is a gene, which has a specific value and determines the genotype and the phenotype of the solution.
- Generation: At each iteration, GA performs a set of genetic operators (selection, crossover, mutation, etc.) on the current population to generate a successive generation.
- Selection: Selection is the process of filtering and retaining solutions according to fitness score. Solutions with higher fitness are more likely to be pushed forward into the next generation. A pair of selected solutions could be chosen as parents to breed and propagate their genes to the subsequent generation via crossover.
- Crossover: Crossover is the way parent chromosomes produce child chromosomes. Genes of parents are reassembled based on a set of crossover points to formulate new offspring.
- Mutation: Mutation in GA is a small random tweak of the genes in a chromosome. It allows GA to explore the solution space and avoid getting stuck in local optima.
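The crossover and mutation operators described above can be sketched on bit-string chromosomes (the crossover point and mutation rate here are illustrative choices):

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def crossover(p1, p2, point):
    # Single-point crossover: splice the parents' genes at the crossover point.
    return p1[:point] + p2[point:]

def mutate(chrom, rate=0.2):
    # Flip each gene with a small probability to explore the solution space.
    return [1 - g if random.random() < rate else g for g in chrom]

child = crossover([1, 1, 1, 1], [0, 0, 0, 0], point=2)
print(child)  # → [1, 1, 0, 0]
mutated = mutate(child)
```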
Here is a demonstration of how a typical GA works:
In the beginning, we randomly generate N solutions as our initial population. We then move on to the evaluation stage, where we assign tasks and calculate a fitness score for each solution. Next, we determine whether to terminate the iteration based on the following conditions: Did we obtain an acceptable solution? Did we reach the time or generation limit? Are we stuck in performance stagnation? If not, we proceed to apply the genetic operators and reproduce offspring for the next generation.
To help illustrate the workflow, here is the “Hello World” problem for GA, the Knapsack Problem.
Imagine that you are given five items with different weights and values. You want to maximize the total value of the items included in your bag. However, your capacity is 15 kg. The optimal collection in this example may seem intuitive to you, but let’s see how a GA approaches this problem.
In this case, we define the fitness function as the sum of the values of the items included in our bag. In other words, we believe a collection is better if its total value is higher. In addition, the total weight must not exceed our capacity. At the beginning of the iteration, we randomly generate four solutions (population) as our first generation. For each solution (chromosome), we denote whether an item is included (“1”) or not (“0”); each of these positions represents a gene. Each proposed solution has a fitness score. The top-scoring solutions are more likely to be chosen as parents (selection). The two selected chromosomes exchange some of their genes based on crossover points (crossover). Moreover, there is a small probability that some genes of the new offspring will change (mutation). Finally, a new generation is born. The loop continues until the termination requirements are met.
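Putting the pieces together, here is a small end-to-end sketch of this knapsack GA. The article does not specify the items' weights and values, so the numbers below are made up; the 15 kg capacity and the population of four follow the example:

```python
import random

random.seed(0)  # fixed seed for reproducibility

ITEMS = [(12, 4), (2, 2), (1, 2), (1, 1), (4, 10)]  # (weight_kg, value), illustrative
CAPACITY = 15

def fitness(chrom):
    # Sum the values of included items; over-capacity bags score 0.
    weight = sum(w for bit, (w, _) in zip(chrom, ITEMS) if bit)
    value = sum(v for bit, (_, v) in zip(chrom, ITEMS) if bit)
    return value if weight <= CAPACITY else 0

def crossover(p1, p2):
    # Single-point crossover: reassemble parent genes around a cut point.
    point = random.randrange(1, len(p1))
    return p1[:point] + p2[point:]

def mutate(chrom, rate=0.1):
    # Flip each gene with a small probability.
    return [1 - g if random.random() < rate else g for g in chrom]

population = [[random.randint(0, 1) for _ in ITEMS] for _ in range(4)]
for generation in range(50):
    # Selection: the fittest chromosomes become parents.
    population.sort(key=fitness, reverse=True)
    parents = population[:2]
    children = [mutate(crossover(*parents)) for _ in range(len(population) - 2)]
    population = parents + children  # keeping the two best preserves progress

best = max(population, key=fitness)
print(best, fitness(best))
```

With these numbers, the best feasible collection is items 2–5 (8 kg, value 15); the loop usually converges to it well within 50 generations.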
Now that we have some basic knowledge of ANN and GA, let’s dive into NEAT!
What is the NeuroEvolution of Augmenting Topologies?
In short, NEAT is a GA designed to evolve ANNs. One big question you may have is: what makes NEAT special? I believe the answer can be found in this brilliant 6-page paper written by Kenneth Stanley and Risto Miikkulainen in 2002. The following discussion is based on this original paper.
The process of evolving an ANN with a GA instead of backpropagation is also known as neuroevolution (NE). The beauty of NEAT is that it offers solutions to three main challenges in NE:
- Is there a genetic representation that allows disparate topologies to crossover in a meaningful way?
- How can topological innovation that needs a few generations to optimize be protected so that it does not disappear from the population prematurely?
- How can topologies be minimized throughout evolution without the need for a specially contrived fitness function that measures complexity?
Here are the key concepts that underpin NEAT:
- Genetic Encoding: Genetic encoding is the way chromosomes are represented in a GA. NEAT uses a direct encoding scheme in which each genome consists of a list of node genes and a list of connection genes. Node genes list all the input, hidden, and output nodes that can be connected, while connection genes store the connection information (in-node, out-node, weight, enabled/disabled) and an innovation number used for historical markings. By doing so, NEAT can quickly line up corresponding genes during crossover.
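Under this scheme, a genome might be represented roughly as follows (the field names are my own; the paper only specifies what information each gene carries):

```python
from dataclasses import dataclass

@dataclass
class NodeGene:
    id: int
    kind: str  # "input", "hidden", or "output"

@dataclass
class ConnectionGene:
    in_node: int
    out_node: int
    weight: float
    enabled: bool
    innovation: int  # historical marking used to align genes during crossover

# A minimal genome: 2 inputs and 1 output, fully connected, no hidden nodes yet.
nodes = [NodeGene(1, "input"), NodeGene(2, "input"), NodeGene(3, "output")]
connections = [
    ConnectionGene(1, 3, 0.5, True, innovation=1),
    ConnectionGene(2, 3, -0.8, True, innovation=2),
]
```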
- Growth: NEAT grows and evolves topologies through structural mutation. A mutation can add a new node by splitting an existing connection, or add a new connection between two previously unconnected nodes. Therefore, NEAT is able to improve genome diversity, explore the solution space, and avoid local optima.
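As a sketch, the add-node mutation can be written as follows. Per the paper, the split connection is disabled, the new incoming link gets weight 1.0, and the outgoing link inherits the old weight, so the network's behaviour is initially preserved (the dict-based gene representation is an assumption for illustration):

```python
import random

def add_node_mutation(connections, new_node_id, next_innovation):
    # Pick an enabled connection, disable it, and route it through a new node.
    conn = random.choice([c for c in connections if c["enabled"]])
    conn["enabled"] = False
    # Incoming link to the new node gets weight 1.0; the outgoing link
    # inherits the old weight, preserving the network's initial behaviour.
    connections.append({"in": conn["in"], "out": new_node_id, "weight": 1.0,
                        "enabled": True, "innovation": next_innovation})
    connections.append({"in": new_node_id, "out": conn["out"], "weight": conn["weight"],
                        "enabled": True, "innovation": next_innovation + 1})

# Split the single connection 1 -> 3 with a new node 4.
conns = [{"in": 1, "out": 3, "weight": 0.5, "enabled": True, "innovation": 1}]
add_node_mutation(conns, new_node_id=4, next_innovation=2)
```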
- Historical Markings: Historical marking is the process of tracking genes. NEAT uses a global innovation number, which represents the chronology of genes in the system, to perform historical markings. A new innovation number is assigned whenever a new node or connection gene is created. During crossover, the offspring randomly inherits genes with the same innovation number (matching genes) from either parent, and inherits those with different innovation numbers (disjoint or excess genes) from the more fit parent. Accordingly, NEAT ensures that crossover is meaningful and resolves the competing conventions problem, a situation in which parents with similar phenotypes produce damaged offspring.
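This alignment rule can be sketched by keying genes on their innovation numbers (here genomes are simplified to `{innovation: weight}` dicts):

```python
import random

def crossover(parent1, parent2, fitter=1):
    # Genomes are simplified to dicts keyed by innovation number.
    child = {}
    for innov in set(parent1) | set(parent2):
        if innov in parent1 and innov in parent2:
            # Matching genes: inherit randomly from either parent.
            child[innov] = random.choice([parent1[innov], parent2[innov]])
        else:
            # Disjoint/excess genes: inherit from the more fit parent only.
            source = parent1 if fitter == 1 else parent2
            if innov in source:
                child[innov] = source[innov]
    return child

p1 = {1: 0.5, 2: -0.3, 4: 0.9}   # carries innovations 1, 2, 4
p2 = {1: 0.1, 3: 0.7}            # carries innovations 1, 3
child = crossover(p1, p2, fitter=1)
print(sorted(child))  # → [1, 2, 4]: matching gene 1 plus p1's unmatched genes
```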
- Speciation: A topological innovation usually comes with reduced fitness and requires time to optimize its performance. However, if it competes with the overall population directly, a new topology is likely to be eliminated before it reaches its best fitness. This is why speciation plays a critical role in NEAT. NEAT measures a compatibility distance between two genomes, calculated as a linear combination of the average weight difference of matching genes and the numbers of excess and disjoint genes. Genomes are clustered into species based on a compatibility threshold, so each genome competes primarily with those within its own niche. Therefore, new species are protected while they optimize their structures.
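The distance measure from the paper is δ = c1·E/N + c2·D/N + c3·W̄, where E and D are the numbers of excess and disjoint genes, W̄ is the average weight difference of matching genes, and N normalizes for genome size. A sketch, again simplifying genomes to `{innovation: weight}` dicts and using illustrative coefficients and threshold:

```python
def compatibility(genome1, genome2, c1=1.0, c2=1.0, c3=0.4):
    # delta = c1*E/N + c2*D/N + c3*W_bar
    shared = set(genome1) & set(genome2)
    unmatched = set(genome1) ^ set(genome2)
    cutoff = min(max(genome1), max(genome2))
    # Excess genes lie beyond the other parent's last innovation number;
    # the remaining unmatched genes are disjoint.
    excess = sum(1 for i in unmatched if i > cutoff)
    disjoint = len(unmatched) - excess
    w_bar = (sum(abs(genome1[i] - genome2[i]) for i in shared) / len(shared)
             if shared else 0.0)
    n = max(len(genome1), len(genome2))
    return c1 * excess / n + c2 * disjoint / n + c3 * w_bar

g1 = {1: 0.5, 2: -0.3, 4: 0.9}
g2 = {1: 0.1, 3: 0.7}
# Two genomes belong to the same species if their distance is under a threshold.
print(compatibility(g1, g2) < 3.0)
```

Here gene 4 is excess, genes 2 and 3 are disjoint, and only gene 1 contributes to the weight term, giving a distance of 1/3 + 2/3 + 0.4·0.4 = 1.16.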
- Minimizing Dimensionality: NEAT always begins with a uniform population of minimal networks without any hidden nodes. New structure is introduced by structural mutation and protected by speciation, and fitness evaluation then determines whether the innovation is useful. In this way, NEAT increases complexity only when needed and thus reduces training time.
Here is an overview of the dependencies among these important components of NEAT:
To sum up, NEAT (1) utilizes historical markings to track genes and avoid competing conventions, so that meaningful crossover can happen with less topological analysis; (2) separates the population into species based on compatibility distance, so that competition is primarily within the same niche and innovation is protected; (3) starts from the simplest structure and grows only as necessary through mutation, so that a solution is found faster.
Now that we know how NEAT works, let’s see how to use it.