[mlpack-git] master: fix bug--forgot to initialize parameter scale (6e7239c)

gitdub at big.cc.gt.atl.ga.us
Fri Nov 20 11:16:37 EST 2015


Repository : https://github.com/mlpack/mlpack

On branch  : master
Link       : https://github.com/mlpack/mlpack/compare/c99256848a352da3c59610c8e37a21d0d5be9736...d69ac29a1c33b4e303b79ac7af939cc4cb37edd4

>---------------------------------------------------------------

commit 6e7239cfd23c32499134800b700140a5da26bcfe
Author: stereomatchingkiss <stereomatchingkiss at gmail.com>
Date:   Wed Oct 21 05:18:32 2015 +0800

    fix bug--forgot to initialize parameter scale


>---------------------------------------------------------------

6e7239cfd23c32499134800b700140a5da26bcfe
 src/mlpack/methods/ann/layer/dropout_layer.hpp | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/src/mlpack/methods/ann/layer/dropout_layer.hpp b/src/mlpack/methods/ann/layer/dropout_layer.hpp
index 9c7511b..f51f8b2 100644
--- a/src/mlpack/methods/ann/layer/dropout_layer.hpp
+++ b/src/mlpack/methods/ann/layer/dropout_layer.hpp
@@ -58,7 +58,8 @@ class DropoutLayer
   DropoutLayer(const double ratio = 0.5,
                const bool rescale = true) :
       ratio(ratio),
-      rescale(rescale)
+      rescale(rescale),
+      scale(1.0 / (1.0 - ratio))
   {
     // Nothing to do here.
   }

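For context, `scale` is the inverted-dropout rescaling factor applied to the units the layer keeps, so leaving it uninitialized means the forward pass reads an indeterminate value whenever `rescale` is true. Below is a minimal, self-contained sketch of that behavior; the class name, the std::vector interface, and the Forward() signature are illustrative only and not mlpack's actual DropoutLayer API, which operates on Armadillo matrices.

#include <cstddef>
#include <random>
#include <vector>

// Sketch of an inverted-dropout layer (illustrative, not mlpack code).
class DropoutSketch
{
 public:
  DropoutSketch(const double ratio = 0.5, const bool rescale = true) :
      ratio(ratio),
      rescale(rescale),
      scale(1.0 / (1.0 - ratio)) // The fix: without this, scale is indeterminate.
  { }

  // Training-time forward pass: drop each unit with probability `ratio` and,
  // if rescaling is enabled, multiply the kept units by scale = 1 / (1 - ratio).
  std::vector<double> Forward(const std::vector<double>& input)
  {
    std::bernoulli_distribution keep(1.0 - ratio);
    std::vector<double> output(input.size(), 0.0);
    for (std::size_t i = 0; i < input.size(); ++i)
    {
      if (keep(rng))
        output[i] = rescale ? input[i] * scale : input[i];
    }
    return output;
  }

 private:
  double ratio;    // Probability of dropping a unit.
  bool rescale;    // Whether to rescale the kept units.
  double scale;    // Rescaling factor, now set in the constructor.
  std::mt19937 rng{std::random_device{}()};
};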

