[mlpack] GSoC : Parallel Stochastic Optimization Methods - Ideas
Ryan Curtin
ryan at ratml.org
Tue Mar 15 09:11:34 EDT 2016
On Tue, Mar 15, 2016 at 01:01:51PM +0530, Srinivas Kumar wrote:
> Hi Ryan,
>
> My name is Srinivas and I'm a Masters student at the Supercomputer
> Education Research Center (India).
>
> I've worked with different frameworks like OpenMP, MPI, Hadoop, and CUDA. I
> also have good knowledge of C++, Java, and Python.
>
> I'm interested in the "Parallel Stochastic Optimization Methods" project
> that you have offered.
>
> I realized that what you are actually looking for is to fine-tune the
> implementation of SGD for multi-core machines rather than extend it to
> distributed-memory systems.
>
> I came across a paper by Léon Bottou titled "Stochastic Gradient Descent
> Tricks"
>
> Link to paper : http://research.microsoft.com/pubs/192769/tricks-2012.pdf
> Link to code : https://github.com/npinto/bottou-sgd
>
> It offers a nice set of recommendations for implementing SGD. I would like
> to know if it would be a good idea to use these recommendations to
> implement and fine-tune SGD for *mlpack*?
>
> I would like to work on mlpack as a GSoC project. Please let me know what
> you think about this idea.
Hi Srinivas,
You're right that we are looking to extend the optimizers we have
implemented to the multi-core case instead of going fully distributed.
You are more than welcome to submit a proposal to implement these tricks
(or just implement the tricks and submit a PR for them). When you
prepare your proposal, though, you may want to consider more work than
just this paper, because this paper alone will probably not be enough
work for a whole summer.
Thanks,
Ryan
--
Ryan Curtin | "Moo."
ryan at ratml.org | - Eugene Belford