[mlpack-git] [mlpack] implement command line programs of softmaxRegression (#466)

stereomatchingkiss notifications at github.com
Wed Nov 4 02:53:36 EST 2015


>then just pass a big arma::mat& instead of arma::vec&s one at a time

Why arma::vec& and not arma::Row<size_t>?

>Just L-BFGS and SGD for now, but more optimizers may be implemented later.

To work with SGD, I have to implement three more functions:

    size_t NumFunctions() const;
    double Evaluate(const arma::mat& parameters, const size_t i);
    void Gradient(const arma::mat& parameters, const size_t i, arma::mat& gradient);

I am not familiar with SGD, so correct me if I am wrong:

It is the same as mini-batch, but the batch size of SGD is always equal to 1.

I implemented mini-batch before; the principle is:

    // begin and end are random column indices
    Evaluate(data.submat(0, begin, data.n_rows - 1, end), parameters);
    Gradient(data.submat(0, begin, data.n_rows - 1, end), parameters);

The Evaluate and Gradient functions still do the same thing, but this time I only update the cost and gradient based on a random batch of the data. If this is SGD, begin will always be the same as end.



>I personally think that CUDA code is really ugly and difficult to work with

We could rely on the large libraries developed by NVIDIA and the open source community, such as mshadow (https://github.com/dmlc/mshadow) and Thrust.

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/466#issuecomment-153624485
