[mlpack] Neuroevolution Algorithms Implementation - Week 3
bang liu
lbldtb at gmail.com
Mon Jun 13 22:27:55 EDT 2016
Dear all,
During the past week 3, I have finished the implementation of the CNE algorithm
and tested it on the XOR task. The results show that it works quite well and
converges (error = 0) within 500 iterations.
In the course of implementing the CNE algorithm, we implemented the
following classes:
1. LinkGene
2. NeuronGene
3. Genome
4. Population
5. Parameters
6. Tasks
7. CNE
Basically, the purpose of each class can be understood intuitively
from its name.
The key ideas include:
1. Define different task classes, each implementing an EvalFitness(Genome&
genome) method, since fitness evaluation is task specific. In this way, we can
easily apply the CNE algorithm to different kinds of tasks (see the sketch
after this list).
2. Define the algorithm itself by implementing functions such as
InitPopulation()
Reproduce()
Evolve()
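To make the two ideas above concrete, here is a rough sketch of what such a
task class and algorithm interface might look like. This is not the actual
code in the branch; the class layouts, member names, and constructor signature
are only illustrative, and the XOR fitness body is omitted.

#include <cstddef>
#include <vector>

class Genome { /* link genes, neuron genes, activation, ... */ };

// Hypothetical parameter bundle (step 1 of the workflow below).
struct Parameters
{
  std::size_t populationSize;
  double mutateRate;
};

// Fitness evaluation is task specific, so each task supplies its own
// EvalFitness. TaskXor is an illustrative XOR task like the one used for
// testing.
class TaskXor
{
 public:
  double EvalFitness(Genome& /* genome */)
  {
    // In the real task: feed the four XOR patterns through the genome's
    // network, accumulate the output error, and return a fitness that grows
    // as the error shrinks. Omitted in this sketch.
    return 0.0;
  }
};

// The evolution loop is generic once it can initialize a population,
// reproduce, and evolve.
template<typename TaskType>
class CNE
{
 public:
  CNE(TaskType& task, Genome& seedGenome, Parameters& params) :
      task(task), seedGenome(seedGenome), params(params) { }

  void InitPopulation()
  {
    // Fill the population with copies of the seed genome (the real code
    // would also randomize or mutate the weights).
    population.assign(params.populationSize, seedGenome);
  }

  void Reproduce()
  {
    // Select fitter genomes as parents, then crossover and mutate to
    // produce the next generation (omitted in this sketch).
  }

  void Evolve()
  {
    InitPopulation();
    for (std::size_t gen = 0; gen < 500; ++gen)  // e.g. a generation budget.
    {
      for (Genome& genome : population)
        task.EvalFitness(genome);  // Task-specific fitness evaluation.
      Reproduce();
    }
  }

 private:
  TaskType& task;
  Genome& seedGenome;
  Parameters& params;
  std::vector<Genome> population;
};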
Then, when we use the algorithm, the workflow looks roughly like this:
1. Set the parameters for CNE, such as the population size and mutation rate.
2. Set a seed genome for CNE, which is an initial neural network genome used
to initialize the population we will evolve.
3. Set the task type, as it decides which EvalFitness function will be used
during evolution.
4. Create an instance of CNE (e.g., cne) based on the parameters, task, and
seed genome.
5. Evolve by calling cne.Evolve().
An example can be found in ne_test.cpp.
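Roughly, these steps translate into code like the following. This builds on
the illustrative sketch above rather than the exact interface used in
ne_test.cpp; the constructor signature and parameter names are assumptions.

int main()
{
  Parameters params;            // Step 1: CNE parameters (illustrative values).
  params.populationSize = 100;
  params.mutateRate = 0.1;

  Genome seedGenome;            // Step 2: seed genome, e.g. a minimal
                                // 2-input, 1-output network for XOR.

  TaskXor task;                 // Step 3: the task supplies EvalFitness.

  CNE<TaskXor> cne(task, seedGenome, params);  // Step 4: build the algorithm.

  cne.Evolve();                 // Step 5: run the evolution.
  return 0;
}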
This week, we will start implementing the NEAT algorithm and its tests.
Best wishes,
Bang