[mlpack-git] master: Fix -Wmaybe-uninitialized. (4ce5e6f)
gitdub at mlpack.org
Mon Feb 22 15:06:56 EST 2016
Repository : https://github.com/mlpack/mlpack
On branch : master
Link : https://github.com/mlpack/mlpack/compare/f3d692c0124e9667076b97318f6d64661015d368...4ce5e6f1d242c2f13399a2c4dda2bbf8916a8ac4
>---------------------------------------------------------------
commit 4ce5e6f1d242c2f13399a2c4dda2bbf8916a8ac4
Author: Ryan Curtin <ryan at ratml.org>
Date: Mon Feb 22 12:06:56 2016 -0800
Fix -Wmaybe-uninitialized.
In this case the warning is wrong, but I don't mind spending what, an extra
cycle when AdaBoost starts?, to avoid the warning.
>---------------------------------------------------------------
4ce5e6f1d242c2f13399a2c4dda2bbf8916a8ac4
src/mlpack/methods/adaboost/adaboost_impl.hpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/mlpack/methods/adaboost/adaboost_impl.hpp b/src/mlpack/methods/adaboost/adaboost_impl.hpp
index 9800255..d2d16f0 100644
--- a/src/mlpack/methods/adaboost/adaboost_impl.hpp
+++ b/src/mlpack/methods/adaboost/adaboost_impl.hpp
@@ -73,7 +73,7 @@ void AdaBoost<WeakLearnerType, MatType>::Train(
// crt is the cumulative rt value for terminating the optimization when rt is
// changing by less than the tolerance.
- double rt, crt, alphat = 0.0, zt;
+ double rt, crt = 0.0, alphat = 0.0, zt;
ztProduct = 1.0;
More information about the mlpack-git mailing list