[mlpack-git] [mlpack] Optimizer transition: Adam, Adadelta (#555)

Marcus Edel notifications at github.com
Mon Mar 7 18:23:33 EST 2016


We recently decided to use the optimizers in core/optimizers for the network modules instead of maintaining special optimizers just for the network modules. However, methods/ann/optimizer still contains some implementations that aren't available to the rest of mlpack, namely Adam and Adadelta. (RMSprop used to be on that list too, but it was reimplemented in da1207a9e6407e835, which made that optimizer available to the rest of mlpack.) It should be fairly easy to reimplement the other two methods the same way.
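For anyone picking this up, the update rules themselves are small. Below is a minimal sketch of single Adam and Adadelta steps in Armadillo; the function and parameter names are hypothetical and not mlpack's actual API, and the core/optimizers versions would presumably wrap steps like these in a class templated on the decomposable function type, following the existing SGD/RMSprop pattern.

#include <armadillo>
#include <cmath>

// A sketch of one Adam update (Kingma & Ba, 2014).  The names here
// are hypothetical, not mlpack's API: m and v are the running first
// and second moment estimates, t is the 1-based iteration count.
void AdamStep(arma::mat& iterate,
              const arma::mat& gradient,
              arma::mat& m,
              arma::mat& v,
              const size_t t,
              const double stepSize = 0.001,
              const double beta1 = 0.9,
              const double beta2 = 0.999,
              const double eps = 1e-8)
{
  m = beta1 * m + (1 - beta1) * gradient;              // First moment.
  v = beta2 * v + (1 - beta2) * (gradient % gradient); // Second moment.

  // Bias-corrected moment estimates.
  const arma::mat mHat = m / (1 - std::pow(beta1, (double) t));
  const arma::mat vHat = v / (1 - std::pow(beta2, (double) t));

  iterate -= stepSize * mHat / (arma::sqrt(vHat) + eps);
}

// A sketch of one Adadelta update (Zeiler, 2012), again with
// hypothetical names: running averages of the squared gradients and
// of the squared parameter updates.
void AdadeltaStep(arma::mat& iterate,
                  const arma::mat& gradient,
                  arma::mat& meanSquaredGrad,
                  arma::mat& meanSquaredDelta,
                  const double rho = 0.95,
                  const double eps = 1e-6)
{
  meanSquaredGrad = rho * meanSquaredGrad + (1 - rho) * (gradient % gradient);

  // Scale the gradient by the ratio of the two RMS terms.
  const arma::mat delta = (arma::sqrt(meanSquaredDelta + eps)
      / arma::sqrt(meanSquaredGrad + eps)) % gradient;

  meanSquaredDelta = rho * meanSquaredDelta + (1 - rho) * (delta % delta);

  iterate -= delta;
}

So the porting work is mostly moving code and exposing these steps through the same interface the existing core/optimizers classes use.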

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/555

