[mlpack-git] [mlpack] Fix drop out layer (#463)

Ryan Curtin notifications at github.com
Tue Oct 27 09:42:33 EDT 2015


> For this one, I am not sure how well Armadillo can optimize. Would it do something like
>
> arma::mat randomMat; // build a random matrix
> mask = randomMat;    // copy the data of randomMat into mask
>
> or would it overwrite the data of mask directly, since mask has already allocated its buffer?

I'm not sure what you mean... in that code snippet, if `mask` is already initialized, then it will overwrite the data.

In your example, yes, Armadillo is smart enough to avoid allocating memory, because the matrix is always the same size.  But if you were to choose the size randomly inside your `while` loop, then memory would be reallocated on every iteration.

Basically what I am getting at is that the `mask` matrix is unnecessary: it's only being used to store random values, which are being accessed sequentially and only once.  So why not just generate the random values as they are needed?  i.e.

```
for (size_t i = 0; i < input.n_elem; ++i)
{
  const double rand = math::Random();
  if (rand > ratio)
    output[i] = input[i];
  else
    output[i] = 0.0;
}
```
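For reference, here is a self-contained sketch of that maskless approach. It uses `std::mt19937` and `std::vector` in place of mlpack's `math::Random()` and Armadillo matrices purely so it compiles standalone; the `Dropout` function name is hypothetical, not mlpack's API:

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Maskless dropout sketch: draw each random value on the fly instead of
// storing a full mask matrix of random values.
std::vector<double> Dropout(const std::vector<double>& input,
                            const double ratio,
                            std::mt19937& rng)
{
  std::uniform_real_distribution<double> dist(0.0, 1.0);
  std::vector<double> output(input.size());
  for (size_t i = 0; i < input.size(); ++i)
  {
    // Keep the element only if the draw exceeds the dropout ratio.
    output[i] = (dist(rng) > ratio) ? input[i] : 0.0;
  }
  return output;
}
```

Every output element is either the corresponding input element or zero, and no intermediate mask buffer is ever allocated.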

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/463#issuecomment-151500400

