<blockquote>
<p>If we pass the Weights and the RhoCap parameter, we have to handle passing the parameter inside of the fnn class.</p>
</blockquote>

<p>Do you mean that, rather than</p>

<pre><code>SparseErrorFunction(hiddenLayer.Weights(), outputLayer.Weights(), hiddenLayer.RhoCap(), 
                    beta, lambda)
</code></pre>

<p>it would be better to pass the whole network into it?</p>

<pre><code>SparseErrorFunction(network, beta, lambda)
</code></pre>

<p>I have no problem with this; if you think it is okay, I will start implementing it.</p>
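<p>Roughly what I have in mind is something like the sketch below. The template parameter, member names, and constructor signature are just placeholders to illustrate the idea, not the final API: the error function holds a reference to the whole network and reads the layer weights and rho estimates from it when the error is evaluated.</p>

<pre><code>// Sketch only: NetworkType, the member names, and the constructor signature
// are placeholders to illustrate passing the whole network, not the final API.
template&lt;typename NetworkType&gt;
class SparseErrorFunction
{
 public:
  SparseErrorFunction(NetworkType&amp; network,
                      const double beta,
                      const double lambda) :
      network(network), beta(beta), lambda(lambda)
  { }

 private:
  // The layers' Weights() and RhoCap() would be queried through this
  // reference when the error is computed.
  NetworkType&amp; network;
  double beta;
  double lambda;
};
</code></pre>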

<blockquote>
<p>If you'd like to provide some code, I'll be happy to merge the changes. But don't feel obligated.</p>
</blockquote>

<p>OK, I will try to implement it if I have time.</p>

<p style="font-size:small;-webkit-text-size-adjust:none;color:#666;">&mdash;<br>Reply to this email directly or <a href="https://github.com/mlpack/mlpack/pull/451#issuecomment-162486551">view it on GitHub</a>.<img alt="" height="1" src="https://github.com/notifications/beacon/AJ4bFCPDA0IT4gfVuomQTTQU9-5BhUy_ks5pNWDfgaJpZM4GAqt4.gif" width="1" /></p>
<div itemscope itemtype="http://schema.org/EmailMessage">
<div itemprop="action" itemscope itemtype="http://schema.org/ViewAction">
  <link itemprop="url" href="https://github.com/mlpack/mlpack/pull/451#issuecomment-162486551"></link>
  <meta itemprop="name" content="View Pull Request"></meta>
</div>
<meta itemprop="description" content="View this Pull Request on GitHub"></meta>
</div>