[mlpack-git] [mlpack] Optimizer transition: Adam, Adadelta (#555)

Marcus Edel notifications at github.com
Tue Mar 8 14:06:28 EST 2016


You can basically use the RMSprop or SGD optimizer in core/optimizers as a basis to reimplement Adam and Adadelta.

So, take a look at the RMSprop ann implementation (ann/optimizer/rmsprop.hpp) and compare it with the new RMSprop implementation in core/optimizers/rmsprop_impl.hpp.

Once there is an Adam and Adadelta implementation in core/optimizers, I'll go and delete the ann/optimizer folder.
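
To give an idea of the shape, here is a minimal sketch of what an Adam class in core/optimizers could look like. It mirrors the decomposable-function interface the existing SGD/RMSprop optimizers use (NumFunctions(), Gradient(), Evaluate()), but the parameter names, defaults, and the simplified loop are illustrative only, not the final design:

```cpp
// Sketch only: interface modeled on core/optimizers/sgd; names and
// defaults here are assumptions, not the finished implementation.
#include <mlpack/core.hpp>
#include <cmath>

template<typename DecomposableFunctionType>
class Adam
{
 public:
  Adam(DecomposableFunctionType& function,
       const double stepSize = 0.001,
       const double beta1 = 0.9,    // Decay rate of the first moment.
       const double beta2 = 0.999,  // Decay rate of the second moment.
       const double eps = 1e-8,     // Term to avoid division by zero.
       const size_t maxIterations = 100000) :
      function(function), stepSize(stepSize), beta1(beta1), beta2(beta2),
      eps(eps), maxIterations(maxIterations) { }

  double Optimize(arma::mat& iterate)
  {
    const size_t numFunctions = function.NumFunctions();

    // Exponential moving averages of the gradient and its square.
    arma::mat m = arma::zeros<arma::mat>(iterate.n_rows, iterate.n_cols);
    arma::mat v = arma::zeros<arma::mat>(iterate.n_rows, iterate.n_cols);
    arma::mat gradient(iterate.n_rows, iterate.n_cols);

    for (size_t i = 1, currentFunction = 0; i != maxIterations;
         ++i, currentFunction = (currentFunction + 1) % numFunctions)
    {
      function.Gradient(iterate, currentFunction, gradient);

      // Update the biased moment estimates.
      m = beta1 * m + (1 - beta1) * gradient;
      v = beta2 * v + (1 - beta2) * (gradient % gradient);

      // Bias correction: the averages start at zero, so early
      // iterations would otherwise be biased toward zero.
      const arma::mat mHat = m / (1 - std::pow(beta1, (double) i));
      const arma::mat vHat = v / (1 - std::pow(beta2, (double) i));

      iterate -= stepSize * mHat / (arma::sqrt(vHat) + eps);
    }

    // Report the final objective over all the separable functions.
    double overallObjective = 0;
    for (size_t f = 0; f < numFunctions; ++f)
      overallObjective += function.Evaluate(iterate, f);
    return overallObjective;
  }

 private:
  DecomposableFunctionType& function;
  double stepSize, beta1, beta2, eps;
  size_t maxIterations;
};
```

Adadelta would follow the same skeleton, just with its own per-iteration state (accumulated squared gradients and accumulated squared updates) in place of the two moment estimates.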


It would be great if you could start with one of the optimizers (Adam or Adadelta). Maybe someone else would like to pick up the other one; if not, you are more than welcome to reimplement both Adam and Adadelta.

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/555#issuecomment-193919410