[mlpack-git] [mlpack] [Proposal] Develop a scalable FineTune class to fine-tune the parameters of a deep network (#458)
stereomatchingkiss
notifications at github.com
Wed Oct 7 00:18:49 EDT 2015
I made some changes to the API of FineTuneGradient; this API may allow for better-optimized code:
class SoftmaxFineTune
{
 public:
  // Gradient for the layer directly below the softmax output layer.
  // model.Probabilities() holds the softmax outputs; the final elementwise
  // product applies the sigmoid derivative f'(x) = f(x) * (1 - f(x)),
  // where `input` holds that layer's sigmoid activations.
  template<typename T>
  static void LastGradient(arma::mat const &input,
                           arma::mat const &weights,
                           T const &model,
                           arma::mat &gradient)
  {
    gradient = (weights.t() * model.Probabilities()) /
        static_cast<double>(input.n_cols);
    gradient = gradient % (input % (1 - input));
  }

  // Back-propagate the gradient `deriv` one layer further down the stack.
  static void Gradient(arma::mat const &input,
                       arma::mat const &weight,
                       arma::mat const &deriv,
                       arma::mat &output)
  {
    output = (weight.t() * deriv) % (input % (1 - input));
  }
};
---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/458#issuecomment-146073692