[mlpack-git] [mlpack] improve speed of SparseAutoencoder and make it more flexible (#451)

Marcus Edel notifications at github.com
Sun Dec 6 17:17:17 EST 2015


>If we could store the PerformanceFunction, this problem could be solved. Rather than calling the static Error function, call it like ...

You are right. The suggested change looks great to me.
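For concreteness, here is a minimal sketch of what storing the performance function as a member (instead of relying on a static Error call) could look like. The class and member names below are illustrative only, not the actual mlpack code:

```cpp
#include <utility>

// Illustrative only: a network template that stores its performance function
// as a member, so Error() is a non-static call on an object that can carry
// its own state.
template<typename PerformanceFunction>
class NetworkSketch
{
 public:
  explicit NetworkSketch(PerformanceFunction performance = PerformanceFunction()) :
      performance(std::move(performance)) { }

  template<typename DataType>
  double Evaluate(const DataType& output, const DataType& target)
  {
    // Non-static call; the stored instance can hold whatever references or
    // parameters the error computation needs.
    return performance.Error(output, target);
  }

 private:
  PerformanceFunction performance;
};
```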


>The other question is, do you have any plans to provide move constructors and move assignments for the ann modules?

I haven't had the time yet to provide a move constructor, but it's on my list. If you'd like to provide some code, I'll be happy to merge the changes. But don't feel obligated.
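In case it helps, here is a rough sketch of what move support for a module could look like; the member names (weights, delta) are just placeholders, not the actual layer layout:

```cpp
#include <armadillo>
#include <utility>

// Placeholder layer, only to show the shape of the move operations.
class ExampleLayer
{
 public:
  ExampleLayer() = default;

  // Move constructor: take over the matrices instead of copying them.
  ExampleLayer(ExampleLayer&& other) noexcept :
      weights(std::move(other.weights)),
      delta(std::move(other.delta)) { }

  // Move assignment: same idea, with a self-assignment guard.
  ExampleLayer& operator=(ExampleLayer&& other) noexcept
  {
    if (this != &other)
    {
      weights = std::move(other.weights);
      delta = std::move(other.delta);
    }
    return *this;
  }

 private:
  arma::mat weights;
  arma::mat delta;
};
```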

>The performance function should be able to accept a reference to the RhoCap of the hiddenLayer, because both the hiddenLayer and the performance function need to use it.

If we pass the Weights and the RhoCap parameter, we have to handle the passing of the parameters inside of the fnn class; if we pass an instance of the network instead, we can handle the parameters inside the performance function.

If we implement a getter function for the ``RhoCap`` parameter inside the new ``SparseLinearLayer``, we don't need to provide a special constructor. We can update the parameter inside the error function once we call the performance function. What do you think about the idea?
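To make the idea more concrete, here is a rough sketch of how the getter and the network instance could fit together. All names below (SparseLinearLayerSketch, HiddenLayer(), the target sparsity value) are hypothetical and only meant to illustrate the direction, not the actual code in this PR:

```cpp
#include <armadillo>
#include <cmath>

// Hypothetical layer: exposes its average-activation estimate via a getter,
// so no special constructor is needed to share it.
class SparseLinearLayerSketch
{
 public:
  const arma::vec& RhoCap() const { return rhoCap; }

 private:
  arma::vec rhoCap;
};

// Hypothetical performance function: receives the network instance and pulls
// the current RhoCap from the hidden layer when the error is computed.
class SparsePerformanceSketch
{
 public:
  template<typename NetworkType, typename DataType>
  double Error(NetworkType& network,
               const DataType& output,
               const DataType& target)
  {
    const arma::vec& rhoCap = network.HiddenLayer().RhoCap();

    // Reconstruction error.
    const double reconstruction =
        arma::accu(arma::square(output - target)) / 2.0;

    // KL-divergence sparsity penalty, written out element-wise for clarity.
    const double rho = 0.05; // Illustrative target sparsity.
    double sparsity = 0.0;
    for (arma::uword j = 0; j < rhoCap.n_elem; ++j)
    {
      sparsity += rho * std::log(rho / rhoCap(j)) +
          (1 - rho) * std::log((1 - rho) / (1 - rhoCap(j)));
    }

    return reconstruction + sparsity;
  }
};
```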

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/451#issuecomment-162353769