[mlpack-git] master: Use the correct input to accumulate the elements. Thanks to Shangtong for pointing this out. (bb6e5c5)
gitdub at big.cc.gt.atl.ga.us
Fri Jan 30 05:48:55 EST 2015
Repository : https://github.com/mlpack/mlpack
On branch : master
Link : https://github.com/mlpack/mlpack/compare/80e6f6692e8c20743b3ad7f61d50d98cb1fbc3dc...bb6e5c56aab07e6449d86021246b52a9e323f3a0
>---------------------------------------------------------------
commit bb6e5c56aab07e6449d86021246b52a9e323f3a0
Author: Marcus Edel <marcus.edel at fu-berlin.de>
Date: Fri Jan 30 11:48:41 2015 +0100
Use the correct input to accumulate the elements. Thanks to Shangtong for pointing this out.
>---------------------------------------------------------------
bb6e5c56aab07e6449d86021246b52a9e323f3a0
src/mlpack/methods/ann/layer/softmax_layer.hpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/mlpack/methods/ann/layer/softmax_layer.hpp b/src/mlpack/methods/ann/layer/softmax_layer.hpp
index 63af854..1583035 100644
--- a/src/mlpack/methods/ann/layer/softmax_layer.hpp
+++ b/src/mlpack/methods/ann/layer/softmax_layer.hpp
@@ -48,7 +48,7 @@ class SoftmaxLayer
void FeedForward(const VecType& inputActivation, VecType& outputActivation)
{
outputActivation = arma::trunc_exp(inputActivation);
- outputActivation /= arma::accu(inputActivation);
+ outputActivation /= arma::accu(outputActivation);
}
/**
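The one-line patch matters because softmax must normalize by the sum of the *exponentiated* values (`outputActivation`), not by the sum of the raw inputs (`inputActivation`); otherwise the outputs do not form a probability distribution. A minimal sketch of the corrected behavior in plain Python (not mlpack code, just an illustration of the fix):

```python
import math

def softmax(x):
    # Exponentiate each input, mirroring arma::trunc_exp(inputActivation).
    exps = [math.exp(v) for v in x]
    # Correct: divide by the sum of the exponentiated values
    # (arma::accu(outputActivation) in the patched line).
    total = sum(exps)
    return [e / total for e in exps]

# The pre-patch code divided by arma::accu(inputActivation), i.e. sum(x),
# so the results would not sum to 1 in general.
```

With the fix, the outputs of `softmax` always sum to 1, which is the invariant the layer's consumers rely on.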
More information about the mlpack-git mailing list