[mlpack-git] [mlpack] Add leaky ReLUs (#412)

Marcus Edel notifications at github.com
Mon Feb 29 16:04:15 EST 2016


You are right, this is a good starting point to get familiar with the code.

The BaseLayer only works with activation functions that can be called without any additional parameters, like the sigmoid or the tanh function. Since the leaky rectified linear function takes the leakiness factor as an additional parameter, you can't use the BaseLayer to call it. But there is an easy solution: you can implement the ``LeakyReLULayer`` directly, without first implementing the activation function in ``ann/activation_functions``. The ``LeakyReLULayer`` should have the same functions as the SoftmaxLayer, but should allow the leakiness factor to be specified in the constructor.
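
For illustration, here is a rough standalone sketch of what such a layer could look like, assuming Armadillo matrices and a simplified Forward/Backward interface (the method names, the member name ``alpha``, and the default leakiness value are placeholders for illustration, not the actual mlpack layer API):

```cpp
#include <armadillo>

class LeakyReLULayer
{
 public:
  // The leakiness factor alpha controls the slope for negative inputs.
  LeakyReLULayer(const double alpha = 0.03) : alpha(alpha) { }

  // Forward pass: f(x) = max(x, alpha * x), applied element-wise.
  void Forward(const arma::mat& input, arma::mat& output)
  {
    output = arma::max(input, alpha * input);
  }

  // Backward pass: f'(x) = 1 if x > 0, alpha otherwise; the derivative is
  // multiplied element-wise with the incoming gradient gy.
  void Backward(const arma::mat& input, const arma::mat& gy, arma::mat& g)
  {
    arma::mat derivative = input;
    derivative.transform([this](double x) { return x > 0.0 ? 1.0 : alpha; });
    g = gy % derivative;
  }

 private:
  // Leakiness factor, set once in the constructor.
  double alpha;
};
```

The key point is that the forward pass computes max(x, alpha * x) and the backward pass scales the incoming gradient by 1 for positive inputs and by alpha otherwise, so the factor only needs to be stored once in the constructor instead of being passed through a parameterless activation function.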

Please leave a comment if something doesn't make sense.

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/412#issuecomment-190391536

