<p>You can implement the <code>HardTanhLayer</code> directly. The extra activation functions (e.g. the <code>tanh</code> function) are meant to be used in combination with the <code>BaseLayer</code>, but <code>BaseLayer</code> only works with activation functions that can be called without any additional parameters, like the sigmoid or <code>tanh</code> functions. Since the <code>HardTanhLayer</code> should allow the min and max values to be specified as additional parameters, you can't use the <code>BaseLayer</code>. The <code>HardTanhLayer</code> should provide the same functions as <code>SoftmaxLayer</code>.</p>
