[mlpack-git] [mlpack] Add Dropout/DropConnect (#413)

Marcus Edel notifications at github.com
Fri Mar 4 15:34:38 EST 2016


@theaverageguy you are right, I really like the images from the authors: http://cs.nyu.edu/~wanli/dropc/.

The implementation of the DropConnectLayer isn't that different from the DropoutLayer, so you can use the DropoutLayer as a basis. Imagine you would like to create a simple feedforward network, something like this:

```
LinearLayer<> inputLayer(10, 2);
BiasLayer<> inputBiasLayer(2);
ReLULayer<> inputBaseLayer;

LinearLayer<> hiddenLayer(2, 10);
ReLULayer<> outputLayer;
```
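Since the DropoutLayer is the basis here, it might help to recall roughly what its forward pass does: it zeroes individual activations and rescales the survivors. A minimal sketch (Armadillo types and the function/parameter names are just assumptions, not the exact mlpack code):

```
#include <armadillo>

// Rough sketch of a dropout forward pass (not the exact mlpack code):
// each activation is dropped with probability `ratio`, the rest are rescaled.
void DropoutForward(const arma::mat& input, arma::mat& output,
                    arma::mat& mask, const double ratio)
{
  // Bernoulli mask: an element survives with probability (1 - ratio).
  mask = arma::randu<arma::mat>(input.n_rows, input.n_cols);
  mask.transform([ratio](double val) { return (val > ratio) ? 1.0 : 0.0; });

  // Apply the mask and rescale so the expected activation stays the same.
  output = (input % mask) * (1.0 / (1.0 - ratio));
}
```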

Now we would like to use DropConnect between the input and the first hidden layer, so what we need to do is randomly set weights of the inputLayer to 0. Let us modify our feedforward network so that it uses this new DropConnectLayer:

```
LinearLayer<> inputLayer(10, 2);
DropConnectLayer<> dropConnectLayer(inputLayer, 0.5, true);

BiasLayer<> inputBiasLayer(2);
ReLULayer<> inputBaseLayer;

LinearLayer<> hiddenLayer(2, 10);
ReLULayer<> outputLayer;
```
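The key difference to dropout is that the binary mask now lives on the weights of the wrapped layer instead of on the activations. Just as an illustration (plain Armadillo, names made up, not mlpack code), DropConnect for a linear layer amounts to:

```
#include <armadillo>

int main()
{
  arma::mat weights = arma::randn<arma::mat>(2, 10); // like LinearLayer<>(10, 2)
  arma::mat input = arma::randn<arma::mat>(10, 1);

  // Binary mask over the weights: each connection survives with probability 0.5.
  arma::mat mask = arma::randu<arma::mat>(2, 10);
  mask.transform([](double val) { return (val > 0.5) ? 1.0 : 0.0; });

  // DropConnect: mask the weights, then do the usual linear forward pass.
  arma::mat output = (mask % weights) * input;
  output.print("output");

  return 0;
}
```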

As you can see, the constructor of the DropConnectLayer is similar to the DropoutLayer's, but takes an additional parameter:

```
DropConnectLayer(layer, ratio, rescale)
```
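A possible shape for the class and its constructor could be something like the following (the template parameter and member names are just assumptions, not the final API):

```
#include <armadillo>

// Sketch only: wraps an existing layer whose weights get randomly dropped.
template<typename WrappedLayerType>
class DropConnectLayer
{
 public:
  DropConnectLayer(WrappedLayerType& layer,  // layer to mask (e.g. a LinearLayer)
                   const double ratio = 0.5, // probability of dropping a weight
                   const bool rescale = true) :
      layer(layer), ratio(ratio), rescale(rescale)
  {
  }

 private:
  WrappedLayerType& layer;
  double ratio;
  bool rescale;

  arma::mat mask;         // binary mask over the weights
  arma::mat denseWeights; // copy of the unmasked weights, restored after a pass
};
```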

In this case, we use the wrapped layer (the LinearLayer) inside the DropConnectLayer for the weight modification. So the ```Forward(...)``` function should look like:

```
void Forward(input, output)
{
  // Element-wise multiply the weights with the binary mask, dropping connections.
  layer.Weights() %= mask;
  layer.Forward(input, output);
}
```

We modify the weights of the wrapped layer and then call the layer's own ```Forward()``` function. The ```Backward()``` function works the same way.
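Fleshing that out a little, here is a sketch under the same assumptions as above (Armadillo types, a ```Backward()``` signature assumed to mirror the other layers, and ignoring the deterministic/testing mode and the rescale option):

```
void Forward(const arma::mat& input, arma::mat& output)
{
  // Draw a fresh Bernoulli mask over the weights for this pass.
  mask = arma::randu<arma::mat>(layer.Weights().n_rows, layer.Weights().n_cols);
  mask.transform([this](double val) { return (val > ratio) ? 1.0 : 0.0; });

  // Keep the unmasked weights so they can be restored later.
  denseWeights = layer.Weights();
  layer.Weights() %= mask;

  // Delegate to the wrapped layer, which now uses the masked weights.
  layer.Forward(input, output);
}

void Backward(const arma::mat& input, const arma::mat& gy, arma::mat& g)
{
  // Backpropagate through the wrapped layer with the same masked weights.
  layer.Backward(input, gy, g);

  // Restore the unmasked weights for the next iteration.
  layer.Weights() = denseWeights;
}
```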


I hope this is helpful.

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/413#issuecomment-192457565

