Neural Net Optimize w/ Genetic Algorithm

A good example of combining neural networks with genetic algorithms is NEAT (NeuroEvolution of Augmenting Topologies). It is a genetic algorithm that searches for an optimal network topology, and it is known for keeping the number of hidden nodes down.

They also made a game using it called NERO. It is quite unique and produced tangible, impressive results.

Dr. Stanley's homepage:

http://www.cs.ucf.edu/~kstanley/

Here you'll find just about everything NEAT-related, as he is the one who invented it.


Genetic algorithms can be usefully applied to optimising neural networks, but you have to think a little about what you want to do.

Most "classic" NN training algorithms, such as Back-Propagation, only optimise the weights of the neurons. Genetic algorithms can optimise the weights too, but this will typically be inefficient. However, as you were asking, they can optimise the topology of the network and also the parameters of your training algorithm. You'll have to be especially wary of creating networks that are "over-trained", though.
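As a minimal sketch of what "optimising the topology" can mean, here is a toy GA over a single gene, the hidden-node count. The fitness function is a stand-in of my own invention; in real use it would train a network with that many hidden nodes and return a validation score.

```python
import random

def evolve_topology(fitness, pop_size=20, generations=30, max_hidden=64, seed=0):
    """Toy GA over one integer gene: the number of hidden nodes.

    `fitness` maps a hidden-node count to a score (higher is better).
    In practice it would train a network of that size and report
    validation accuracy; here it is left to the caller.
    """
    rng = random.Random(seed)
    pop = [rng.randint(1, max_hidden) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2                     # "crossover": average the sizes
            if rng.random() < 0.3:                   # mutation: nudge the size
                child = min(max_hidden, max(1, child + rng.choice([-2, -1, 1, 2])))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Stand-in fitness: accuracy saturates with size, minus a complexity
# penalty -- this rewards small networks, as NEAT's bias does.
best = evolve_topology(lambda h: 1 - 0.5 ** h - 0.01 * h)
```

The complexity penalty is what keeps the evolved hidden-node count small; without it the GA would happily grow the network.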

One further technique, using a modified genetic algorithm, can be useful for overcoming a problem with Back-Propagation. Back-Propagation usually finds only local minima, but it finds them accurately and rapidly. Combining a genetic algorithm with Back-Propagation, e.g. in a Lamarckian GA, gives the advantages of both. This technique is briefly described in the GAUL tutorial.
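A rough sketch of the Lamarckian idea, not taken from GAUL itself: each individual is locally refined by a few gradient steps, and the refined weights are written back into its genome before selection. The "training loss" here is a simple quadratic stand-in for a real network loss, and the gradient steps stand in for a few epochs of Back-Propagation.

```python
import random

def local_refine(w, loss, grad, steps=5, lr=0.1):
    """A few gradient steps -- the stand-in for running Back-Propagation."""
    for _ in range(steps):
        w = [wi - lr * gi for wi, gi in zip(w, grad(w))]
    return w

def lamarckian_ga(loss, grad, dim=3, pop_size=10, generations=15, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Lamarckian step: refine every individual locally and keep the
        # refined weights in the genome (acquired traits are inherited).
        pop = [local_refine(w, loss, grad) for w in pop]
        pop.sort(key=loss)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 + rng.gauss(0, 0.1) for x, y in zip(a, b)]
            children.append(child)
        pop = survivors + children
    return min(pop, key=loss)

# Stand-in "training loss": a quadratic bowl with minimum at (1, 2, 3).
target = [1.0, 2.0, 3.0]
loss = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))
grad = lambda w: [2 * (wi - ti) for wi, ti in zip(w, target)]
best = lamarckian_ga(loss, grad)
```

The GA supplies global exploration across basins; the gradient steps supply the fast, accurate local descent that plain GAs lack.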


Actually, there are multiple things you can optimize in a NN using a GA. You can optimize the structure (number of nodes, layers, activation functions, etc.). You can also train with a GA, which means setting the weights.

Genetic algorithms will never be the most efficient option, but they are usually used when you have little clue as to what numbers to use.

For training, you can use other algorithms, including backpropagation, Nelder-Mead, etc.

You said you wanted to optimize the number of hidden nodes; for this, a genetic algorithm may be sufficient, although far from "optimal". The space you are searching is probably too small for genetic algorithms to pay off, but they can still work, and AFAIK they are already implemented in MATLAB, so no biggie.

What do you mean by optimizing the amount of training done? If you mean the number of epochs, that's fine; just remember that training somewhat depends on the starting weights, which are usually random, so the fitness function used for the GA won't really be a function (the same candidate can score differently on different runs).
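One common workaround for that noisy fitness, sketched here with invented names: evaluate each candidate topology several times with different random initialisations and average the scores, so the GA sees a stabler signal. `train_and_score` is a hypothetical trainer; `noisy_score` below just simulates one.

```python
import random
import statistics

def averaged_fitness(train_and_score, n_hidden, n_repeats=5):
    """Average a candidate's score over several random initialisations.

    `train_and_score(n_hidden, seed)` is assumed to train a network with
    `n_hidden` hidden nodes from a seed-dependent random start and return
    a validation score. Averaging reduces the seed-induced noise.
    """
    scores = [train_and_score(n_hidden, seed) for seed in range(n_repeats)]
    return statistics.mean(scores)

# Stand-in trainer: score grows with size, plus seed-dependent noise
# mimicking the effect of random starting weights.
def noisy_score(n_hidden, seed):
    rng = random.Random(seed * 1000 + n_hidden)
    return 1 - 0.5 ** n_hidden + rng.gauss(0, 0.05)

single = noisy_score(8, 0)            # one run: noisy
avg = averaged_fitness(noisy_score, 8)  # averaged: stabler
```

More repeats cost more training time per fitness evaluation, so in practice this is a trade-off between GA convergence and wall-clock budget.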