[mlpack] Getting Introduced to the Community

Marcus Edel marcus.edel at fu-berlin.de
Sun Mar 1 11:25:42 EST 2015


Hello Prakhar,

> I really enjoyed reading about Bidirectional Neural networks and Dropout.

Yeah, it's a really nice idea to consider both the past and the future context
in the data.
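
For intuition: a bidirectional pass just runs one recurrent pass forward in time
and a second one backward, then combines the two hidden states at every step.
Here is a very rough sketch using Armadillo (illustrative only -- the names and
interface are made up and not mlpack code):

#include <armadillo>
#include <vector>

// Run one recurrent pass over the sequence in each direction and
// concatenate the hidden states, so every time step sees both the
// past and the future context.
std::vector<arma::vec> BidirectionalPass(const std::vector<arma::vec>& sequence,
                                         const arma::mat& Wf, const arma::mat& Uf,
                                         const arma::mat& Wb, const arma::mat& Ub)
{
  const size_t T = sequence.size();
  std::vector<arma::vec> forward(T), backward(T), combined(T);
  arma::vec hf(Wf.n_rows, arma::fill::zeros);
  arma::vec hb(Wb.n_rows, arma::fill::zeros);

  // Forward direction: accumulates the past context.
  for (size_t t = 0; t < T; ++t)
  {
    hf = arma::tanh(Wf * sequence[t] + Uf * hf);
    forward[t] = hf;
  }

  // Backward direction: accumulates the future context.
  for (size_t t = T; t > 0; --t)
  {
    hb = arma::tanh(Wb * sequence[t - 1] + Ub * hb);
    backward[t - 1] = hb;
  }

  // Combine both directions at each step (here simply by concatenation).
  for (size_t t = 0; t < T; ++t)
    combined[t] = arma::join_cols(forward[t], backward[t]);

  return combined;
}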

> I would like to start by trying to implement Dropout for mlpack and get back
> with issues as and when required. 

You can find more information about how to tackle the problem in issue #413.
In addition, I've added some deep-learning-related issues, and the ideas page
now points to the related open tickets.
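
Since you want to start with dropout: the key idea from the paper is to randomly
drop units during training. Just to illustrate the mechanics, here is a rough
sketch of an inverted-dropout forward/backward pass using Armadillo; the function
names are hypothetical and not the interface discussed in #413:

#include <armadillo>

// Forward pass: zero out each activation with probability p and scale the
// survivors by 1 / (1 - p), so no rescaling is needed at test time.
arma::mat DropoutForward(const arma::mat& input,
                         const double p,
                         arma::mat& mask)
{
  // Bernoulli mask: 1 with probability (1 - p), 0 otherwise.
  mask = arma::conv_to<arma::mat>::from(
      arma::randu<arma::mat>(input.n_rows, input.n_cols) > p);
  return (input % mask) / (1.0 - p);
}

// Backward pass: gradients only flow through the units that were kept.
arma::mat DropoutBackward(const arma::mat& gradOutput,
                          const arma::mat& mask,
                          const double p)
{
  return (gradOutput % mask) / (1.0 - p);
}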

Hopefully this is helpful. Feel free to respond if you have any questions.

Thanks,
Marcus

> On 28 Feb 2015, at 18:44, PRAKHAR AGARWAL <f2012277 at pilani.bits-pilani.ac.in> wrote:
> 
> Hi Marcus,
> 
> Thanks for the resources!
> I really enjoyed reading about Bidirectional Neural networks and Dropout.
> I would like to start by trying to implement Dropout for mlpack and get back with issues as and when required. 
> So could you please add some simple tasks that would help me get familiar with the internal neural network architecture of mlpack?
> 
> Thanks,
> 
> Prakhar
> 
> On Wed, Feb 25, 2015 at 4:40 AM, Marcus Edel <marcus.edel at fu-berlin.de> wrote:
> Hi Prakhar,
> 
> Unfortunately there aren't any deep learning related issues. I will spend some
> time over the weekend to add some simple tasks which might be helpful to get
> familiar with the internal neural network architecture.
> 
> Since you're familiar with recurrent networks, maybe you are interested in adding
> bidirectional recurrent networks:
> 
> "Bidirectional Recurrent Neural Networks"
> http://www.di.ufpe.br/~fnj/RNA/bibliografia/BRNN.pdf
> This is the paper that first introduced the idea of building a bidirectional
> recurrent network. Maybe you'll find it interesting.
> 
> Another interesting and simple task could be to implement dropout:
> 
> "Dropout: A Simple Way to Prevent Neural Networks from Overfitting"
> http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf
> An interesting technique to tackle overfitting. The key idea is to randomly drop
> units during training.
> 
> If you'd like, we can discuss this further.
> 
> Thanks,
> Marcus
> 
>> On 24 Feb 2015, at 17:47, Ryan Curtin <ryan at ratml.org> wrote:
>> 
>> On Tue, Feb 24, 2015 at 03:33:41PM +0530, PRAKHAR AGARWAL wrote:
>>> Hi,
>>> 
>>> My name is Prakhar Agarwal and I would like to introduce myself to the
>>> developers of this community. I'm (technically) a junior at Birla Institute
>>> of Technology and Science Pilani. I am well versed with Python, C/C++,
>>> Machine Learning, PHP, and bash. I have been using mlpack and I'm now
>>> comfortable with it.
>>> 
>>> I was very much fascinated by the concept of Deep Learning and wanted to
>>> explore it more, and what better way to start than by writing an algorithm
>>> and seeing it in action?
>>> 
>>> I was having a look at the GSoC 2015 ideas
>>> <https://github.com/mlpack/mlpack/wiki/SummerOfCodeIdeas> page and the
>>> particular project of Essential Deep Learning Modules
>>> <https://github.com/mlpack/mlpack/wiki/SummerOfCodeIdeas#essential-deep-learning-modules>.
>>> I have some basic knowledge of belief networks and recurrent networks
>>> and have worked on artificial neural networks. I have also tried building
>>> mlpack from source and writing basic mlpack programs.
>>> 
>>> Could you please point me to what I should look at that would be helpful
>>> for the project?
>> 
>> Hi Prakhar,
>> 
>> Since you are interested in neural networks, I would spend some time
>> taking a look at the code that Marcus and Shangtong have written, both
>> in src/mlpack/methods/ann/ and in pull request #405.  Other than that,
>> you might consider taking a look through the list of open issues on
>> Github and seeing if any of them are interesting to you.  They are
>> labeled with difficulty, so this should help in finding ones that are
>> easier for people who aren't intricately familiar with the internals of
>> mlpack.
>> 
>> Hope that helps -- if not, please feel free to ask more questions. :)
>> 
>> Thanks,
>> 
>> Ryan
>> 
>> -- 
>> Ryan Curtin    | "Open the pig!"
>> ryan at ratml.org |   - Frank Moses
>> _______________________________________________
>> mlpack mailing list
>> mlpack at cc.gatech.edu
>> https://mailman.cc.gatech.edu/mailman/listinfo/mlpack
> 
> 
