[mlpack] GSOC 2014: Introduction

Udit Saxena saxena.udit at gmail.com
Tue Feb 25 12:34:05 EST 2014


Hi,

This is regarding the idea of implementing AdaBoost.

I have started looking up a few papers on AdaBoost implementations, most of
which involve Schapire and Freund.

I was going through the list of methods implemented by mlpack, and believe
that some weak learners would have to be implemented as well.
I imagine the list of tasks would look something like this:

   - implementing a few weak learners:
      - alternating decision trees
      - C4.5/C5: note that C5 also includes boosting options
      - something simple like weighted linear least squares
      - some controlled version of random forests (unlikely, this one)
   - the basic AdaBoost algorithm is quite susceptible to noise and
   outliers, so a good goal would be to focus on "gentle AdaBoost"
   - AdaBoost.M1 and AdaBoost.M2 are also good goals for implementing
   multiclass classification
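To make the first two tasks concrete, here is a minimal sketch of the basic
binary AdaBoost algorithm (Freund and Schapire's formulation) using decision
stumps as the weak learner. This is only an illustration, not mlpack's API;
the names here (Stump, FitStump, AdaBoost) are hypothetical.

```cpp
// Sketch of binary AdaBoost with decision stumps. Hypothetical names;
// not mlpack's actual API. Labels are assumed to be in {-1, +1}.
#include <cmath>
#include <cstddef>
#include <vector>

struct Stump {
  std::size_t feature;  // which feature to threshold
  double threshold;     // split point
  int polarity;         // +1: predict +1 when x[feature] >= threshold
  int Predict(const std::vector<double>& x) const {
    return (x[feature] >= threshold) ? polarity : -polarity;
  }
};

// Exhaustively fit the stump minimizing weighted 0/1 error, trying every
// (feature, observed value, polarity) combination as a candidate split.
Stump FitStump(const std::vector<std::vector<double>>& X,
               const std::vector<int>& y,
               const std::vector<double>& w) {
  Stump best{0, 0.0, 1};
  double bestErr = 2.0;  // any real error will beat this
  for (std::size_t f = 0; f < X[0].size(); ++f)
    for (std::size_t i = 0; i < X.size(); ++i)
      for (int pol : {+1, -1}) {
        Stump s{f, X[i][f], pol};
        double err = 0.0;
        for (std::size_t j = 0; j < X.size(); ++j)
          if (s.Predict(X[j]) != y[j]) err += w[j];
        if (err < bestErr) { bestErr = err; best = s; }
      }
  return best;
}

struct AdaBoost {
  std::vector<Stump> stumps;
  std::vector<double> alphas;

  void Train(const std::vector<std::vector<double>>& X,
             const std::vector<int>& y, std::size_t rounds) {
    const std::size_t n = X.size();
    std::vector<double> w(n, 1.0 / n);  // uniform initial weights
    for (std::size_t t = 0; t < rounds; ++t) {
      Stump s = FitStump(X, y, w);
      double err = 0.0;
      for (std::size_t i = 0; i < n; ++i)
        if (s.Predict(X[i]) != y[i]) err += w[i];
      if (err >= 0.5) break;            // no better than chance; stop
      err = std::max(err, 1e-10);       // guard against division by zero
      const double alpha = 0.5 * std::log((1.0 - err) / err);
      // Reweight: misclassified points gain weight, correct ones lose it.
      double sum = 0.0;
      for (std::size_t i = 0; i < n; ++i) {
        w[i] *= std::exp(-alpha * y[i] * s.Predict(X[i]));
        sum += w[i];
      }
      for (double& wi : w) wi /= sum;   // renormalize to a distribution
      stumps.push_back(s);
      alphas.push_back(alpha);
    }
  }

  // Final classifier: sign of the alpha-weighted vote of the stumps.
  int Predict(const std::vector<double>& x) const {
    double score = 0.0;
    for (std::size_t t = 0; t < stumps.size(); ++t)
      score += alphas[t] * stumps[t].Predict(x);
    return (score >= 0.0) ? +1 : -1;
  }
};
```

Gentle AdaBoost and the multiclass variants (.M1, .M2) differ mainly in how
the weight update and the weak learner's vote are computed, so most of the
structure above would be shared.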

So, as you can see, I'd welcome suggestions for variants of weak learners, as
most of mine are decision-tree based. I am reading a paper on this too.

There is also a wide variety of AdaBoost-based algorithms and extensions:
LogitBoost, MPBoost, icsiboost. I guess we will come up with one of our own,
specific to mlpack, but I just wanted to post a few ideas.

Who might be a potential mentor for this project/idea?

Going through last year's list, I am also interested in packaging mlpack for
Debian and Ubuntu. I think it could be combined with this idea for a
summer's worth of coding.

Thanks.

--
Udit Saxena
Student, BITS Pilani
India
