[mlpack-git] [mlpack] Fix drop out layer (#463)
Marcus Edel
notifications at github.com
Thu Oct 22 07:16:45 EDT 2015
> Sorry for my misunderstanding; next time I should open an issue before treating it as a bug.
No worries; I guess a pull request sometimes makes it easier to talk about changes.
> Do you mean you want to set the ratio when you call the Forward function?
I would put ``scale = 1.0 / (1.0 - ratio);`` at the beginning of the ``Forward(...)`` function, so the ``scale`` value is calculated even in prediction-only mode. Additionally, a user could change the ratio (and with it the scale) afterwards through the ``Ratio()`` function.
```
template<typename eT>
void Forward(const arma::Mat<eT>& input, arma::Mat<eT>& output)
{
  // Recompute the scale on every call, so a ratio changed through
  // Ratio() takes effect on the next forward pass.
  scale = 1.0 / (1.0 - ratio);

  // The dropout mask is not applied in deterministic mode (during
  // testing).
  if (deterministic)
  {
    output = input;

    if (rescale)
      output *= scale;
  }
  else
  {
    // Scale with input / (1 - ratio) and set values to zero with
    // probability ratio.
    mask = arma::randu<arma::Mat<eT> >(input.n_rows, input.n_cols);
    mask.transform( [&](double val) { return (val > ratio); } );
    output = input % mask * scale;
  }
}
```
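For illustration, here is a minimal usage sketch of that behavior. It assumes the layer keeps the usual mlpack accessor/modifier pair ``double& Ratio()`` and a ``DropoutLayer<>`` constructor taking the ratio; treat the exact names as assumptions from this PR's context rather than the final API:

```
// Hypothetical sketch: adjusting the dropout ratio after construction.
// Because Forward(...) recomputes scale on every call, no extra
// bookkeeping is needed when the ratio changes.
DropoutLayer<> dropout(0.5); // drop activations with probability 0.5
dropout.Ratio() = 0.3;       // lower the dropout probability

// On the next dropout.Forward(input, output) call, scale is
// recalculated as 1.0 / (1.0 - 0.3) ~= 1.43.
```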
---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/463#issuecomment-150184035