The Evolution of Color

This Emergent Mind project, created by Matt Mazur in June 2014, uses a genetic algorithm to evolve a population of colors. You select which color is the "fittest," and the population evolves toward it in real time. It is a hands-on demonstration of how natural selection works at the level of bits and bytes.

How It Works

A standard RGB color is made up of exactly three bytes (24 bits). Red, for example, is 11111111 00000000 00000000 in binary, or #FF0000 in hex. Yellow is 11111111 11111111 00000000, or #FFFF00.
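Packing the three channels into a single integer makes the 24-bit chromosome concrete. A minimal Python sketch (the project itself runs in the browser, so this is an illustration, not its actual code):

```python
# Pack three 8-bit RGB channels into one 24-bit chromosome.
def color_bits(r, g, b):
    """Return the color as a 24-bit integer: red in the high byte."""
    return (r << 16) | (g << 8) | b

red = color_bits(255, 0, 0)
yellow = color_bits(255, 255, 0)

print(f"{red:024b}")     # 111111110000000000000000
print(f"#{red:06X}")     # #FF0000
print(f"{yellow:024b}")  # 111111111111111100000000
print(f"#{yellow:06X}")  # #FFFF00
```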

When you load the page, the population consists of randomly generated colors. Each one has a random 24-bit chromosome. The "fittest" color is set to red by default, but you can change it at any time. The closer a color's bits are to the fittest color's bits, the more likely it is to survive to the next generation.

Hover over any color to see its binary representation, the fittest color's binary representation, and how many of its 24 bits match. This count is its fitness score.
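Counting matching bits is a one-liner once the colors are integers: XOR marks the differing bits, so the fitness score is 24 minus the number of ones in the XOR. A Python sketch of that idea:

```python
def fitness(color, fittest):
    """Count how many of the 24 bits match the fittest color's bits."""
    mismatches = bin(color ^ fittest).count("1")  # XOR sets each differing bit
    return 24 - mismatches

print(fitness(0xFF0000, 0xFF0000))  # 24: exact match
print(fitness(0xFFFF00, 0xFF0000))  # 16: yellow vs. red differ in one byte
```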

The Genetic Algorithm

The genetic algorithm used here was adapted from the one described by Melanie Mitchell in An Introduction to Genetic Algorithms.

Here's how it works:

  1. Initialization. Fill the grid with randomly generated 24-bit chromosomes (the colors). Each square in the grid is one member of the population.
  2. Fitness evaluation. For each color, count how many of its 24 bits match the corresponding bits in the fittest color's chromosome. A perfect match scores 24.
  3. Selection. Choose two parent colors from the population using fitness-proportionate selection (roulette wheel sampling). Each color gets a slice of the wheel proportional to its fitness, so fitter colors are more likely to be chosen. Selection is done with replacement, so the same color can be picked more than once. The parents can come from anywhere in the grid.
  4. Crossover. With probability 0.7, cross the two parents at a random bit position to produce two offspring. If crossover does not occur, the offspring are copies of their parents.
  5. Mutation. Each bit in each offspring has a 1-in-1,000 chance of flipping.
  6. Replacement. Repeat steps 3 through 5 until there are enough offspring to fill the entire grid. Then discard the old population and replace it with the new one. A color's position in the grid has no significance; the grid is just a way to display the full population at once. This completes one generation.
  7. Repeat. Go back to step 2. The process continues indefinitely until you stop it.
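The seven steps above can be sketched end to end in Python. The population size and generation count here are arbitrary choices for illustration, not the project's actual parameters; the crossover and mutation rates are the ones given in the steps:

```python
import random

POP_SIZE = 100          # one grid square per member (grid size chosen arbitrarily)
CROSSOVER_RATE = 0.7    # step 4
MUTATION_RATE = 1 / 1000  # step 5
BITS = 24

def fitness(color, fittest):
    """Step 2: number of bits shared with the fittest color."""
    return BITS - bin(color ^ fittest).count("1")

def roulette_pick(population, weights):
    """Step 3: fitness-proportionate selection, with replacement."""
    return random.choices(population, weights=weights, k=1)[0]

def crossover(a, b):
    """Step 4: single-point crossover with probability 0.7."""
    if random.random() < CROSSOVER_RATE:
        point = random.randrange(1, BITS)   # random bit position
        mask = (1 << point) - 1             # low `point` bits come from the other parent
        return (a & ~mask) | (b & mask), (b & ~mask) | (a & mask)
    return a, b

def mutate(c):
    """Step 5: each bit flips with probability 1/1000."""
    for bit in range(BITS):
        if random.random() < MUTATION_RATE:
            c ^= 1 << bit
    return c

def next_generation(population, fittest):
    """Steps 3-6: breed offspring until the new grid is full, then swap."""
    weights = [fitness(c, fittest) for c in population]
    if sum(weights) == 0:                   # degenerate case: uniform selection
        weights = [1] * len(population)
    offspring = []
    while len(offspring) < len(population):
        a = roulette_pick(population, weights)
        b = roulette_pick(population, weights)
        child1, child2 = crossover(a, b)
        offspring.append(mutate(child1))
        if len(offspring) < len(population):
            offspring.append(mutate(child2))
    return offspring

# Step 1: random initial population; step 7: loop. Evolve toward red.
fittest = 0xFF0000
pop = [random.getrandbits(BITS) for _ in range(POP_SIZE)]
for _ in range(300):
    pop = next_generation(pop, fittest)
avg = sum(fitness(c, fittest) for c in pop) / POP_SIZE
print(f"average fitness after 300 generations: {avg:.1f} / 24")
```

A random population averages around 12 matching bits; after a few hundred generations the average climbs close to 24 but never quite reaches it, since mutation keeps reintroducing variation.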

Why It's Interesting

The population never perfectly converges on the fittest color because mutation and crossover keep introducing variation. Something similar happens in nature: even in a stable environment, genetic variation persists because of the randomness inherent in reproduction.

The most dramatic moments happen when you change the fittest color mid-run. If you swap all the bits (say, from black #000000 to white #FFFFFF), the average fitness plummets. Colors that were well-adapted to the old target are suddenly poorly adapted to the new one. Over generations, the population recovers as colors closer to the new target outcompete the rest.
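The black-to-white swap inverts every fitness score exactly: a bit that matched black now mismatches white, so a color with k matching bits drops to 24 − k. A small check, reusing the bit-counting fitness measure described earlier:

```python
def fitness(color, fittest, bits=24):
    """Matching bits between a color and the fittest color."""
    return bits - bin(color ^ fittest).count("1")

black, white = 0x000000, 0xFFFFFF
color = 0xFF0000  # red: 8 one bits, 16 zero bits

before = fitness(color, black)  # 16: red's 16 zero bits match black
after = fitness(color, white)   # 8: only red's 8 one bits match white
print(before, after)            # 16 8
assert before + after == 24     # flipping every target bit inverts the score
```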

This mirrors what happens when a real environment changes abruptly. If the average global temperature rose ten degrees over a short period, organisms adapted to the cooler climate would find themselves poorly suited. Over time, those that survived would pass on their genes, and the average fitness of the population would climb back up.

The Fitness Sparkline

The sparkline in the controls tracks the population's average fitness over the last 150 generations. When the environment is stable, the line climbs quickly and plateaus near (but not at) 24. When you change the fittest color, you can see the line drop and then recover. The steepness of the recovery depends on how different the new target is from the old one.

Real-World Applications

Genetic algorithms like this one are used in engineering optimization (antenna design, circuit layout), scheduling problems, machine learning hyperparameter tuning, and procedural content generation in games. The core idea, that a population of candidate solutions can improve through selection, crossover, and mutation, turns out to be surprisingly effective for problems where the search space is too large to explore exhaustively.

Originally published June 16, 2014