[mlpack-git] [mlpack/mlpack] NeuralEvolution - implemented gene, genome (#686)

Excalibur notifications at github.com
Tue Jun 7 16:54:01 EDT 2016


> +
> +      // Loop neurons to calculate neurons' activation.
> +      for (unsigned int j = aNumInput; j < aNeuronGenes.size(); ++j) {
> +        double x = aNeuronGenes[j].aInput;  // TODO: consider bias. Difference?
> +        aNeuronGenes[j].aInput = 0;
> +
> +        double y = 0;
> +        switch (aNeuronGenes[j].Type()) { // TODO: revise the implementation.
> +          case SIGMOID:                   // TODO: more cases.
> +            y = sigmoid(x);
> +            break;
> +          case RELU:
> +            y = relu(x);
> +            break;
> +          default:
> +            y = sigmoid(x);

Do you think it would be a good idea to make our implementation of the activation functions the same as
https://github.com/mlpack/mlpack/tree/master/src/mlpack/methods/ann/activation_functions
but keep it in our own ne folder? That way, if the functions in the ann module change, they won't affect the ne module. Besides, we could add more functions to the ne module without changing the ann code.
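
For illustration only, something like the following could live in the ne folder and mirror the style of the ann activation functions; the header path and the class names (ne::SigmoidFunction, ne::ReLUFunction) are hypothetical, not the PR's actual code:

    // Hypothetical ne/activation_functions.hpp sketch: local copies of the
    // activation functions so the ne module does not depend on the ann module.
    #include <algorithm>
    #include <cmath>

    namespace mlpack {
    namespace ne {

    // Logistic sigmoid: f(x) = 1 / (1 + e^(-x)).
    class SigmoidFunction
    {
     public:
      static double Fn(const double x) { return 1.0 / (1.0 + std::exp(-x)); }
    };

    // Rectified linear unit: f(x) = max(0, x).
    class ReLUFunction
    {
     public:
      static double Fn(const double x) { return std::max(0.0, x); }
    };

    } // namespace ne
    } // namespace mlpack

The switch in the quoted loop could then call these local functions, and new activation types could be added here without touching the ann code.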

---
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/686/files/3c8aa62b951f029b3883e9baef1ea556ef5af2d3#r66150677

