Evolutionary Computing

Explore how genetic algorithms can help machines learn

Natural Selection.

Organisms in nature have one goal in life - to survive. Their ability to adapt to changes in their environment increases their chances of doing so and, in turn, makes them far more likely to have offspring and pass on beneficial DNA. We can observe this in any natural environment, from polar bears developing a thick layer of fat to survive in the cold, to bacteria becoming resistant to certain types of antibiotics. This ability to adapt underpins Darwin's theory of evolution by natural selection - a concept we can use to make a machine learn.

By applying the same principles that organisms undergo in natural selection to our programs, we can solve a multitude of problems. In fact, genetic algorithms based on natural selection are so effective at searching large, complex problem spaces that they are widely applied within the fields of artificial intelligence and machine learning!

A Human Analogy.

To help understand the ideas behind natural selection in computing, let's start with humans as an example. We begin with a population of, say, one hundred people. Each person in the population wants to mate with someone who has green eyes, brown hair, and can run very fast. A fitness value can then be assigned to each member of the population - simply a number that indicates how close they are to the desired set of attributes. Someone with brown hair, for example, will have a higher fitness value than someone with blonde hair.
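The fitness value described above can be sketched as a small scoring function. Everything here is illustrative: the trait names, the equal weighting, and the speed cutoff are assumptions made for the analogy, not part of any real genetic model.

```python
def fitness(person):
    """Score a person between 0 and 1 against the desired traits."""
    score = 0
    if person["eyes"] == "green":
        score += 1
    if person["hair"] == "brown":
        score += 1
    # Treat anything above 8 m/s as "very fast" (an arbitrary cutoff).
    if person["sprint_speed"] >= 8.0:
        score += 1
    return score / 3  # normalise: three desired traits in total

# Green eyes and a fast sprint, but blonde hair: two of three traits.
alice = {"eyes": "green", "hair": "blonde", "sprint_speed": 9.1}
print(fitness(alice))
```

A person matching all three traits would score 1, and one matching none would score 0 - exactly the "how close to the desired attributes" idea above.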

Individuals in the population with higher fitness values are much more likely to breed with one another and produce offspring that carry the desired traits. Beneficial DNA is therefore passed on to the next generation, and the process repeats until everyone has green eyes, brown hair, and can run very fast! The goal of the population is then complete.

Use in Computation.

The same principles that we observe in nature can also be applied to computational problems. In general, solving a problem with a genetic algorithm requires six steps:

1. Initialise a population - This is often randomly generated and may be small or large in size.
2. Evaluate the population - Individuals in this group are measured on how close they are to the solution, known as their fitness.
3. Apply selection - Individuals with the highest fitness values are more likely to be selected from the group, and in turn pass on their DNA.
4. Crossover - Create new individuals from the selected parents by merging their DNA.
5. Mutation - A random chance that DNA in the child will mutate.
6. Repeat - cycle through until the solution is found!
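The six steps above can be sketched as a single loop. This is a minimal toy example - evolving a bit string until every bit is 1 - and the population size, mutation rate, and simple two-way tournament selection are all illustrative choices, not prescribed values.

```python
import random

random.seed(1)
GENES, POP, MUTATION_RATE = 20, 100, 0.01

def fitness(dna):                     # step 2: closeness to the solution
    return sum(dna) / GENES

def select(population):               # step 3: fitter DNA is chosen more often
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(mum, dad):              # step 4: merge DNA from both parents
    cut = random.randrange(GENES)
    return mum[:cut] + dad[cut:]

def mutate(dna):                      # step 5: small chance each gene flips
    return [1 - g if random.random() < MUTATION_RATE else g for g in dna]

population = [[random.randint(0, 1) for _ in range(GENES)]
              for _ in range(POP)]    # step 1: random initial population

generation = 0                        # step 6: repeat until solved (capped)
while max(fitness(d) for d in population) < 1.0 and generation < 500:
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]
    generation += 1

best = max(fitness(d) for d in population)
print(f"Best fitness {best} after {generation} generations")
```

Note how each helper function maps directly onto one numbered step; only the problem-specific parts (the DNA encoding and the fitness function) change between applications.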

Example: Missiles!

To put all of this information into perspective, we shall use a worked example involving some intelligent 'missiles'. These missiles have one goal - to reach the target at the end of the map in the quickest time possible. Following the steps discussed, our initial population will consist of one hundred missiles. Each one can apply a boost in any direction at each time step, and stores this sequence of movements as its DNA. This gives us one hundred randomly generated agents, each storing its own genetic information.
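One way to encode this DNA is as a fixed-length sequence of 2D boost vectors, one per time step. The lifespan of 200 steps and the unit-vector encoding here are illustrative assumptions for the sketch.

```python
import math
import random

LIFESPAN = 200  # number of boosts each missile can apply

def random_gene():
    """A unit-length boost in a random direction."""
    angle = random.uniform(0, 2 * math.pi)
    return (math.cos(angle), math.sin(angle))

def random_dna():
    """A missile's genetic information: one boost per time step."""
    return [random_gene() for _ in range(LIFESPAN)]

# Step 1: one hundred randomly generated agents.
population = [random_dna() for _ in range(100)]
```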

Fitness Values.

Once we launch the agents, we let them run for a given amount of time and measure how close each one gets to the target (its fitness). Agents that come close to the mark will have a value near one, and agents that end up far away will have a value near zero. Individuals that make it closer to the target are then much more likely to mate with other agents, and should produce children that get better at hitting the target with each generation.
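One way to produce such a score is to map distance into the range described above, so a missile at the target scores 1 and the score falls towards 0 with distance. The target coordinates and the 1/(1 + d) mapping are illustrative assumptions; any monotonic mapping from distance to [0, 1] would do.

```python
import math

TARGET = (400, 50)  # assumed target position on the map

def fitness(position):
    """Near 1 when close to the target, near 0 when far away."""
    d = math.dist(position, TARGET)
    return 1 / (1 + d)

print(fitness((400, 50)))  # at the target: exactly 1.0
print(fitness((0, 500)))   # far away: close to 0
```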

Selection and Crossover.

To ensure that child missiles gain traits from each parent, we must splice DNA from both strands. Many forms of splicing exist; here, we shall select a random portion of DNA from each parent and combine this information into a new child. This process repeats until a whole new population has been created.
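The splice described above can be sketched as single-point crossover: take the genes up to a random cut from one parent and the rest from the other. The letter strings below stand in for real DNA (such as the boost vectors) purely to make the splice visible.

```python
import random

def crossover(parent_a, parent_b):
    """Combine a random portion of DNA from each parent into a child."""
    cut = random.randrange(len(parent_a))
    return parent_a[:cut] + parent_b[cut:]

child = crossover(list("AAAAAAAA"), list("BBBBBBBB"))
print("".join(child))  # a run of A's followed by B's, depending on the cut
```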

Mutation and Termination.

The final steps of the genetic algorithm are mutation and termination once the solution has been found. Mutation, in this case, is rather simple: we apply a small percentage chance (typically less than one percent) that any gene in a strand of DNA will randomly change direction. This prevents the missiles from converging on a sub-optimal path and keeps enough variance in the DNA to find a solution faster. Termination, on the other hand, takes place when all of the missiles have found their way to the target. Once this has happened, we can say that their goal is complete - they have all survived.
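The mutation step described above can be sketched as follows: each gene independently has a small chance (1% here, matching the "less than one percent" guideline loosely) of being replaced with a fresh random direction. The unit-vector gene encoding is the same illustrative assumption as before.

```python
import math
import random

MUTATION_RATE = 0.01  # per-gene chance of a random change of direction

def random_gene():
    """A unit-length boost in a random direction."""
    angle = random.uniform(0, 2 * math.pi)
    return (math.cos(angle), math.sin(angle))

def mutate(dna):
    """Each gene has a small chance of mutating into a new direction."""
    return [random_gene() if random.random() < MUTATION_RATE else gene
            for gene in dna]

dna = [random_gene() for _ in range(200)]
mutated = mutate(dna)
changed = sum(1 for a, b in zip(dna, mutated) if a != b)
print(f"{changed} of {len(dna)} genes mutated")
```

With 200 genes and a 1% rate, roughly two genes mutate per child on average - enough variance to escape sub-optimal paths without destroying inherited traits.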

Video Example.

The following video shows our missiles learning to hit the target. The simulation itself can be run by downloading the program at the end of the tutorial.

Conclusion.

Our missile example is just one simple demonstration of genetic algorithms at work in the field of machine learning. The basic principles, however, never change. In more complex applications of the algorithm we still perform the same six steps; the only differences are the problem we are trying to solve and what the population and its individuals consist of. It is recommended that you continue reading further on this subject, and we encourage you to download the simulation in the next slide.

Downloads and Links.

Want to learn more? Download the Processing sketch below for the full 2D and 3D simulations.

Processing Demo (2D + 3D) - MAC
Processing Demo (2D + 3D) - WIN