[mlpack-git] [mlpack] Add hard tanh layer (#540)
Marcus Edel
notifications at github.com
Wed Mar 2 09:39:26 EST 2016
The hard tanh function is sometimes preferred over the tanh function since it is computationally cheaper. It does, however, saturate for magnitudes of x greater than 1. The activation of the hard tanh is:
```
f(x) = max, if x > max,
f(x) = min, if x < min,
f(x) = x, otherwise.
```
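The piecewise definition above can be sketched as a single clamp. This is only an illustration; the function name `hardTanH` and the default bounds of -1 and 1 are assumptions, not mlpack code:

```cpp
#include <algorithm>

// Hard tanh: clamp x into [minValue, maxValue].
// Equivalent to the piecewise definition above.
double hardTanH(const double x,
                const double minValue = -1.0,
                const double maxValue = 1.0)
{
  return std::min(std::max(x, minValue), maxValue);
}
```

For example, `hardTanH(0.5)` returns 0.5 (the identity region), while `hardTanH(3.0)` saturates at 1.0.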
It would be great if mlpack provided an implementation of a `HardTanHLayer` that allows the specification of the min and max values:
```
HardTanHLayer(const double minValue = -1, const double maxValue = 1)
```
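One possible shape for such a layer is sketched below. The class layout, method names (`Forward`, `Backward`), and the use of `std::vector` instead of mlpack's matrix types are all assumptions for illustration, not the actual mlpack API. The backward pass uses the fact that the derivative of hard tanh is 1 inside (min, max) and 0 where the activation saturates:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical sketch of the requested HardTanHLayer interface;
// not mlpack's actual implementation.
class HardTanHLayer
{
 public:
  HardTanHLayer(const double minValue = -1.0, const double maxValue = 1.0) :
      minValue(minValue), maxValue(maxValue) { }

  // Forward pass: clamp every input element into [minValue, maxValue].
  void Forward(const std::vector<double>& input,
               std::vector<double>& output) const
  {
    output.resize(input.size());
    for (std::size_t i = 0; i < input.size(); ++i)
      output[i] = std::min(std::max(input[i], minValue), maxValue);
  }

  // Backward pass: pass the upstream gradient gy through where the
  // unit is not saturated; the gradient is zero in the saturated regions.
  void Backward(const std::vector<double>& input,
                const std::vector<double>& gy,
                std::vector<double>& g) const
  {
    g.resize(input.size());
    for (std::size_t i = 0; i < input.size(); ++i)
      g[i] = (input[i] > minValue && input[i] < maxValue) ? gy[i] : 0.0;
  }

 private:
  double minValue;
  double maxValue;
};
```

Because the gradient is exactly zero outside the bounds, units that saturate stop learning, which is the trade-off for the cheaper computation compared to tanh.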
---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/540