[mlpack-svn] r13819 - mlpack/trunk/src/mlpack/methods/nca
fastlab-svn at coffeetalk-1.cc.gatech.edu
Thu Nov 1 12:58:23 EDT 2012
Author: rcurtin
Date: 2012-11-01 12:58:23 -0400 (Thu, 01 Nov 2012)
New Revision: 13819
Modified:
mlpack/trunk/src/mlpack/methods/nca/nca_softmax_error_function_impl.hpp
Log:
We were neglecting a term in the gradient...
Modified: mlpack/trunk/src/mlpack/methods/nca/nca_softmax_error_function_impl.hpp
===================================================================
--- mlpack/trunk/src/mlpack/methods/nca/nca_softmax_error_function_impl.hpp 2012-11-01 16:58:10 UTC (rev 13818)
+++ mlpack/trunk/src/mlpack/methods/nca/nca_softmax_error_function_impl.hpp 2012-11-01 16:58:23 UTC (rev 13819)
@@ -153,12 +153,13 @@
if (i == k)
continue;
- // Calculate p_ik.
+ // Calculate the numerator of p_ik.
double eval = exp(-metric.Evaluate(stretchedDataset.unsafe_col(i),
stretchedDataset.unsafe_col(k)));
// If the points are in the same class, we must add to the second term of
- // the gradient as well as the numerator of p_i.
+ // the gradient as well as the numerator of p_i. We will divide by the
+ // denominator of p_ik later.
if (labels[i] == labels[k])
{
numerator += eval;
@@ -168,7 +169,7 @@
}
// We always have to add to the denominator of p_i and the first term of the
- // gradient computation.
+ // gradient computation. We will divide by the denominator of p_ik later.
denominator += eval;
firstTerm += eval *
(stretchedDataset.col(i) - stretchedDataset.col(k)) *
@@ -180,7 +181,11 @@
if (denominator == 0)
Log::Warn << "Denominator of p_" << i << " is 0!" << std::endl;
else
+ {
p = numerator / denominator;
+ firstTerm /= denominator;
+ secondTerm /= denominator;
+ }
// Now multiply the first term by p_i, and add the two together and multiply
// all by 2 * A. We negate it though, because our optimizer is a minimizer.