[mlpack-git] [mlpack] Fix drop out layer (#463)
Marcus Edel
notifications at github.com
Mon Nov 9 16:52:52 EST 2015
It looks like we can solve the issues by introducing a function ```void Ratio(double ratio)``` that sets the ratio parameter and updates the scale parameter whenever the ratio changes.
```
void Ratio(const double r)
{
  ratio = r;
  scale = 1.0 / (1.0 - ratio);
}
```
To avoid passing through the input matrix twice when both deterministic and rescale are true, we should rewrite the block as @rcurtin suggested:
```
if (deterministic)
{
  if (!rescale)
    output = input;
  else
    output = input * scale;
}
```
---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/463#issuecomment-155209995