• Stars: 101
• Rank: 338,166 (Top 7%)
• Language: Kotlin
• License: Other
• Created: over 8 years ago
• Updated: over 5 years ago


Repository Details

Genetic Algorithms for drawing images

Genetic Draw

Draw Images via genetic programming. Translations: 中文版 (Chinese), 日本語版 (Japanese).

The Algorithms

There are two algorithms used in Genetic Draw. Both demonstrate the use of Genetic Programming to evolve an image from one or more DNAs.

The DNA is a list of genes, where each gene encodes a polygon. The polygon could be a square, circle, rectangle, ellipse, triangle, or N-vertex polygon. In addition, each gene encodes its polygon's color, location (including z-index), transparency, and size.
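A minimal sketch of how such a gene and DNA might be modeled in Kotlin. The class and field names here are assumptions made for the sketches that follow, not the repository's actual types:

```kotlin
import java.awt.Color

// Illustrative model of a gene and a DNA (hypothetical names, not the repo's code).
data class Gene(
    val vertices: List<Pair<Int, Int>>, // the polygon's points (location and size)
    val color: Color,                   // fill color, including alpha (transparency)
    val zIndex: Int                     // draw order when the DNA is expressed
)

data class Dna(val genes: List<Gene>)
```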

When a specific DNA is "expressed", I iterate over each of its genes and render the encoded polygons to an image. I do this for each of the DNAs. The rendered image is then compared to a target image, and the difference between the two is defined as the fitness of the DNA. In this program, 0 is perfect, meaning the evolved image is exactly the same as the target image. Strictly speaking, this makes it an error function rather than a fitness function, but it works just as well.
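A sketch of expression and the error function under the same assumptions, rendering with java.awt and comparing pixel by pixel, where 0 means a perfect match:

```kotlin
import java.awt.Color
import java.awt.image.BufferedImage
import kotlin.math.abs

// Express a DNA: render its polygons in z-order onto a blank canvas.
fun express(dna: Dna, width: Int, height: Int): BufferedImage {
    val image = BufferedImage(width, height, BufferedImage.TYPE_INT_RGB)
    val g = image.createGraphics()
    g.color = Color.WHITE
    g.fillRect(0, 0, width, height)
    for (gene in dna.genes.sortedBy { it.zIndex }) {
        g.color = gene.color
        g.fillPolygon(
            gene.vertices.map { it.first }.toIntArray(),
            gene.vertices.map { it.second }.toIntArray(),
            gene.vertices.size
        )
    }
    g.dispose()
    return image
}

// Error function: sum of per-channel pixel differences (0 = identical images).
fun error(rendered: BufferedImage, target: BufferedImage): Long {
    var total = 0L
    for (y in 0 until target.height) {
        for (x in 0 until target.width) {
            val a = Color(rendered.getRGB(x, y))
            val b = Color(target.getRGB(x, y))
            total += abs(a.red - b.red) + abs(a.green - b.green) + abs(a.blue - b.blue)
        }
    }
    return total
}
```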

Single Parent Genetic Programming (SingleParentGeneticDraw.kt)

  1. Randomly generate a parent DNA.
  2. Make a clone of the parent DNA, randomly mutating some of its genes. We'll call this the child DNA.
  3. Measure the fitness of the child DNA. This is done by drawing the polygons to an image and comparing it pixel-by-pixel to a target image.
  4. If the child's DNA is more fit than its parent's DNA, then set the parent to be the child. (The old parent is discarded.)
  5. Repeat from step 2.
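Put together, the loop could look like the following hill-climbing sketch. It assumes the express()/error() sketches above, plus hypothetical randomDna() and mutate() helpers; the real implementation lives in SingleParentGeneticDraw.kt:

```kotlin
import java.awt.image.BufferedImage

// Sketch of the single-parent loop, assuming express()/error() from above
// and hypothetical randomDna()/mutate() helpers.
fun evolveSingleParent(target: BufferedImage, generations: Int): Dna {
    var parent = randomDna()
    var parentError = error(express(parent, target.width, target.height), target)
    repeat(generations) {
        val child = mutate(parent)      // step 2: clone the parent, mutate some genes
        val childError = error(express(child, target.width, target.height), target)
        if (childError < parentError) { // step 4: keep whichever DNA is fitter
            parent = child
            parentError = childError
        }
    }
    return parent
}
```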

Population-Based Two Parent Genetic Programming (PopulationBasedGeneticDraw.kt)

  1. Randomly generate a population of DNAs.
  2. Measure the fitness of all of the DNAs and sort them by fitness.
  3. Optionally, apply elitism: the most fit DNAs pass directly into the next generation.
  4. Optionally, kill off the bottom x% (lowest fitness) of the population.
  5. Using tournament selection or stochastic acceptance, determine which of the remaining DNAs should reproduce to make the next generation of DNAs. Reproduce enough children to repopulate it.
  6. For each selected pair of parents, perform crossover by taking 50% of the genes from each parent, then apply random mutations to produce the child DNAs. The children now become the current population.
  7. Repeat from step 2.
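One generation of this loop might look like the sketch below, using tournament selection. Here scored is the population paired with its error values, sorted lowest-error first; crossover() is sketched inline, and mutate() is the hypothetical helper from the earlier sketches:

```kotlin
import kotlin.random.Random

// One generation of the population-based algorithm, sketched with tournament
// selection. scored = (dna, error) pairs sorted by error, lowest first.
fun nextGeneration(scored: List<Pair<Dna, Long>>, eliteCount: Int, cullFraction: Double): List<Dna> {
    val elite = scored.take(eliteCount).map { it.first }                      // step 3: elitism
    val survivors = scored.take((scored.size * (1.0 - cullFraction)).toInt()) // step 4: cull bottom x%
    val children = mutableListOf<Dna>()
    while (elite.size + children.size < scored.size) {                        // step 5: repopulate
        val mom = tournament(survivors)
        val dad = tournament(survivors)
        children += mutate(crossover(mom, dad))                               // step 6: crossover + mutate
    }
    return elite + children
}

// Tournament selection: the fittest of k random contestants wins (lower error is better).
fun tournament(pool: List<Pair<Dna, Long>>, k: Int = 4): Dna =
    List(k) { pool[Random.nextInt(pool.size)] }.minByOrNull { it.second }!!.first

// Crossover: each child gene comes from one parent or the other with equal probability.
fun crossover(a: Dna, b: Dna): Dna =
    Dna(a.genes.zip(b.genes).map { (ga, gb) -> if (Random.nextBoolean()) ga else gb })
```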

Select Renderings & Stats

Two partially evolved versions of Bulbasaur, produced with two-parent sexual reproduction. The population used stochastic acceptance with elitism to generate each new generation.


Two GIFs showing the evolution of a square.

GIF showing the evolution of the DataRank and Simply Measured logos.

Evolutions of Mario. The first uses a polygon-rendering DNA. The second and third use DNAs that render fixed-position pixels of size 4x4 px and 8x8 px.

Evolution of Yoshi with convergence rate.

Adding alpha channels to the polygons causes a considerable performance drop (about 7x). While alpha produces better results, there is an option to remove the alpha channel. Below are three renderings: the first two use alpha (left 2500 genes, middle 2000 genes), and the right image demonstrates no transparency.
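An illustration of that option, assuming a gene's color is built when it is expressed; dropping the alpha component trades some quality for the roughly 7x speedup mentioned above:

```kotlin
import java.awt.Color

// Illustrative toggle: express a gene's color with or without transparency.
fun geneColor(r: Int, g: Int, b: Int, alpha: Int, useAlpha: Boolean): Color =
    if (useAlpha) Color(r, g, b, alpha) // translucent polygons: better results, ~7x slower
    else Color(r, g, b)                 // opaque polygons: faster rendering
```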

The canonical example on the internet seems to be evolving the Mona Lisa. The most interesting example I found was by Roger Johansson, found here. Most examples I found used triangles; I had better results mixing many shapes together. On the left is the Mona Lisa evolved using rectangles and ellipses. Below, the first two evolutions use 1000 and 2000 genes containing only rectangles and ellipses, and the third uses only triangles.

Mutation rates and their effect on learning. These are not normalized to generation count, so you'll have to do some manual comparison. As expected, 10% performs better than 1% and 50%. I did not try a large number of intermediate values. (Note the generation count to see the faster convergence.)

On the topic of mutation: a low mutation rate means that the image converges slowly but steadily. However, there is a caveat. With a low mutation rate the algorithm is less likely to explore new search spaces, which may be required to escape a local minimum. Similarly, too large a mutation rate means the algorithm constantly makes changes that are too big, which also slows convergence. A large mutation rate may serve well early in the evolution; however, as time elapses, the micro changes required to finesse the model become less likely to occur. This is analogous to too high a learning rate in neural networks. Instead of adopting a simulated-annealing-like model with a decreasing mutation rate, I implemented a random bounded mutation rate for each child produced. The results were better than with a static mutation probability.
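A sketch of that random bounded mutation rate; the bounds are chosen arbitrarily for illustration, and mutateGene() is a hypothetical helper:

```kotlin
import kotlin.random.Random

// Each child draws its own mutation rate from a bounded range instead of
// using a fixed probability. Bounds and mutateGene() are illustrative.
fun mutate(parent: Dna, minRate: Double = 0.01, maxRate: Double = 0.15): Dna {
    val rate = minRate + Random.nextDouble() * (maxRate - minRate) // per-child rate
    return Dna(parent.genes.map { gene ->
        if (Random.nextDouble() < rate) mutateGene(gene) else gene
    })
}
```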

Our good friend Kirby, evolved using both pixel and polygon rendering DNAs.

However, one of our Kirbys didn't evolve so well. But why? It turns out that I had a bad fitness function. Specifically, my fitness function compared the raw difference between packed RGB integer values of the evolved and target images. In that encoding, red occupies higher bits than green, and green higher bits than blue. This means a difference in red is numerically much larger than an equal difference in green or blue, so the genetic algorithm perceives improvements in red as more important than improvements in blue. In other words, I introduced a bias into the fitness function. The result was random blue blotches in many of the renderings.
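A sketch of the bias: in a packed 0xRRGGBB integer, red sits in the high bits, so a raw integer difference weights a red step roughly 65,536 times more than a blue step; unpacking and comparing channel by channel removes the bias:

```kotlin
import java.awt.Color
import kotlin.math.abs

// The buggy comparison: a difference of 1 in red scores 65536, in blue only 1.
fun biasedPixelError(rgbA: Int, rgbB: Int): Int = abs(rgbA - rgbB)

// The fix: unpack the channels and weight them equally.
fun unbiasedPixelError(rgbA: Int, rgbB: Int): Int {
    val a = Color(rgbA)
    val b = Color(rgbB)
    return abs(a.red - b.red) + abs(a.green - b.green) + abs(a.blue - b.blue)
}
```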

More of the statistics and graphs can be found in an Excel file here.

And finally, a current snapshot of my profile being evolved. :)

And previous versions.

More Repositories

1. kumo: Kumo - Java Word Cloud (Java, 601 stars)
2. bayesian_sentiment_analysis: Pragmatic & Practical Bayesian Sentiment Classifier (Kotlin, 220 stars)
3. swift_boxxle: A mostly complete port of GameBoy's Boxxle to Swift (Swift, 102 stars)
4. ultimate_tictactoe: Ultimate Tic Tac Toe - HTML5 (JavaScript, 53 stars)
5. java_games: Java 2D Game Engine - Building Zelda to test it (Also includes a MineSweeper Demo) (Java, 31 stars)
6. neural_network_kotlin: Autoencoder + Deep Autoencoder + Deep Convoluted Autoencoder + Hebbian (Kotlin, 18 stars)
7. rbm: Deep Convoluted Restricted Boltzmann Machine (Java, 17 stars)
8. kakyll: Static Site Generator in Kotlin (Kotlin, 17 stars)
9. struktural: API Testing Made Easy - Kotlin (Kotlin, 14 stars)
10. cellular-automata-pokemon-types: Cellular Automata - Pokemon Type Battle Simulation (Kotlin, 8 stars)
11. sdl_sprite: A Simple SDL Sprite library (C++, 8 stars)
12. haskell_boxxle: Gameboy's Boxxle in Haskell (Haskell, 7 stars)
13. neuralnetwork: Multi-hidden-layer neural network class (error backpropagation) (Java, 4 stars)
14. legend_of_mu: Legend of Mu (Playdate Game) (Lua, 3 stars)
15. gdx-controller: LibGDX Controller Library (Java, 3 stars)
16. c_gameoflife: John Conway's Game of Life in C++ (SDL) (C++, 3 stars)
17. watchgpt: WatchGPT (Swift, 3 stars)
18. supportvectormachine: Support Vector Machine (Perl, 3 stars)
19. chatgpt: Programs written by ChatGPT (Python, 2 stars)
20. game_of_life_nes: NES Game of Life (C++, 2 stars)
21. log4php: PHP Logger (PHP, 2 stars)
22. blockchain: Simple BlockChain in Kotlin (Kotlin, 2 stars)
23. art: Digital art (Kotlin, 2 stars)
24. blocks: A Tetris Clone + added difficulty (PureBasic, 2 stars)
25. soroban: Calculator Library using a Pratt Parser for MixFix grammar (Java, 2 stars)
26. mandelbrot: Mandelbrot generator in Kotlin (Kotlin, 2 stars)
27. vulcan: Vulcan - Space Shooter - Purebasic (PureBasic, 2 stars)
28. games_classic: Collection of various games' source code (2 stars)
29. photo_filter: Some basic photo filtering algorithms (Java, 2 stars)
30. fleschkincaid: Flesch-Kincaid Reading Index (Java, 2 stars)
31. pipes: Start of a "data flow" programming language based around data flowing through "pipes" (Java, 2 stars)
32. ml: Machine Learning - Collection of my personal machine learning libraries/experiments (Terra, 2 stars)
33. breakout: Breakout - Purebasic (PureBasic, 2 stars)
34. catuskoti_automata: Catuskoti cellular automata (Kotlin, 1 star)
35. minesweeper: Simple Minesweeper game in C (C++, 1 star)
36. antlr_dna: A Sample DNA grammar written using Antlr (Java, 1 star)
37. trie_kotlin: Prefix Tree / Radix Tree / Trie in Kotlin (Kotlin, 1 star)
38. linux_asm: My Assembly Collection (Assembly, 1 star)
39. selforganizingmap: Self-Organizing Map (Java, 1 star)
40. jGames: JQuery plugin for displaying various board games (JavaScript, 1 star)
41. euler: Project Euler (85 solutions) - Java (Java, 1 star)
42. resource_optimization: Resource Optimization via Monte-Carlo (Kotlin, 1 star)
43. minesweeper_libgdx: A Mine Sweeper implementation using LibGDX (Java, 1 star)
44. snakes: Snakes - Purebasic (PureBasic, 1 star)