[mlpack-git] [mlpack] Added LeakyReLU (and hardtanh) activation layer and its test. (#544)
Dhawal Arora
notifications at github.com
Thu Mar 3 08:48:32 EST 2016
Yeah, I figured that out. I coded it considering both possibilities; if it ends up in the layers, I think fewer changes will be needed. It has the forward and the backward passes :)
---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/544#issuecomment-191766341