[mlpack-git] [mlpack] Add hard tanh layer (#540)

Marcus Edel notifications at github.com
Wed Mar 2 10:21:28 EST 2016


You can implement the ```HardTanhLayer``` directly. The standalone activation functions (e.g. the ```tanh``` function) are meant to be used in combination with the ```BaseLayer```, but the ```BaseLayer``` only works with activation functions that can be called without any additional parameters, like the sigmoid or ```tanh``` functions. Since the ```HardTanhLayer``` should allow the min and max values to be specified as additional parameters, you can't use the ```BaseLayer```. Instead, the ```HardTanhLayer``` should expose the same functions as the ```SoftmaxLayer``` (see the sketch below).
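A minimal sketch of what that could look like, assuming the ```Forward(input, output)```/```Backward(input, gy, g)``` interface used by the existing ann layers; the ```minValue```/```maxValue``` parameter names are placeholders, not the final API:

```
// Minimal sketch of a HardTanhLayer exposing the same Forward()/Backward()
// interface as SoftmaxLayer. The minValue/maxValue names are placeholders.
#include <armadillo>

template<typename InputDataType = arma::mat,
         typename OutputDataType = arma::mat>
class HardTanhLayer
{
 public:
  // The min/max parameters are exactly what rules out the BaseLayer approach.
  HardTanhLayer(const double minValue = -1.0, const double maxValue = 1.0) :
      minValue(minValue), maxValue(maxValue) { }

  // f(x) = maxValue if x > maxValue, minValue if x < minValue, x otherwise.
  template<typename eT>
  void Forward(const arma::Mat<eT>& input, arma::Mat<eT>& output)
  {
    output = arma::clamp(input, minValue, maxValue);
  }

  // f'(x) = 1 inside (minValue, maxValue) and 0 outside, so the incoming
  // gradient gy is passed through only where the unit did not saturate.
  template<typename eT>
  void Backward(const arma::Mat<eT>& input,
                const arma::Mat<eT>& gy,
                arma::Mat<eT>& g)
  {
    g = gy;
    g.elem(arma::find(input <= minValue)).zeros();
    g.elem(arma::find(input >= maxValue)).zeros();
  }

 private:
  double minValue;
  double maxValue;
};
```

Whether the boundary points themselves count as saturated (the ```<=```/```>=``` above) is a convention; the gradient of the hard tanh is usually taken to be zero there.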



---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/540#issuecomment-191285253