I had the same result when I used make_tuple rather than tie. I fixed it
in my own fork:
http://github.com/theSundayProgrammer/mlpack
branch: mytweaks
SHA1: 540dd7c3b3a5f3857eed922d446bfeb5016dbcd9

Although the change is for CNN rather than FNN, I expect a similar change
should work for FNN too.

On Thu, Mar 3, 2016 at 6:33 AM, Joe Mariadassou <joe.mariadassou@gmail.com>
wrote:

> I had the same result when I used make_tuple rather than tie. I fixed it
> in my own fork:
> http://github.com/theSundayProgrammer/mlpack
> branch: mytweaks
> SHA1: 540dd7c3b3a5f3857eed922d446bfeb5016dbcd9
>
> Although the change is for CNN rather than FNN, I expect a similar change
> should work for FNN too.
>
> On Thu, Mar 3, 2016 at 3:29 AM, sudarshan <notifications@github.com>
> wrote:
>
>> Thank you. I checked out pr/536 and linked the include directories and
>> lib from there. I got rid of all the compiler errors. However, I get a 100%
>> classification error when I use std::make_tuple, but when I use std::tie I
>> get 4.95916%. This is my function:
>>
>> auto BuildFFN(MatType& trainData, MatType& trainLabels, MatType& testData,
>>               MatType& testLabels, const size_t hiddenLayerSize)
>> {
>>   // Input layer.
>>   ann::LinearLayer<> inputLayer(trainData.n_rows, hiddenLayerSize);
>>   ann::BiasLayer<> inputBiasLayer(hiddenLayerSize);
>>   ann::BaseLayer<PerformanceFunction> inputBaseLayer;
>>
>>   // Hidden layer.
>>   ann::LinearLayer<> hiddenLayer1(hiddenLayerSize, trainLabels.n_rows);
>>   ann::BiasLayer<> hiddenBiasLayer1(trainLabels.n_rows);
>>   ann::BaseLayer<PerformanceFunction> outputLayer;
>>
>>   // Output layer.
>>   OutputLayerType classOutputLayer;
>>
>>   auto modules = std::tie(inputLayer, inputBiasLayer, inputBaseLayer,
>>                           hiddenLayer1, hiddenBiasLayer1, outputLayer);
>>   ann::FFN<decltype(modules), decltype(classOutputLayer),
>>            ann::RandomInitialization, PerformanceFunctionType>
>>       net(modules, classOutputLayer);
>>
>>   net.Train(trainData, trainLabels);
>>   arma::mat prediction;
>>   net.Predict(testData, prediction);
>>
>>   // Count misclassified columns; the counter must be initialised to zero.
>>   size_t classificationError = 0;
>>   for (size_t i = 0; i < testData.n_cols; i++)
>>   {
>>     if (arma::sum(arma::sum(arma::abs(prediction.col(i) - testLabels.col(i)))) != 0)
>>     {
>>       classificationError++;
>>     }
>>   }
>>
>>   std::cout << "Classification Error = "
>>             << (double(classificationError) / testData.n_cols) * 100 << "%"
>>             << std::endl;
>>
>>   return net;
>> }
>>
>> —
>> Reply to this email directly or view it on GitHub
>> <https://github.com/mlpack/mlpack/issues/531#issuecomment-191313956>.
>>
>
> --
> Joseph Chakravarti Mariadassou
> http://thesundayprogrammer.com
>

--
Joseph Chakravarti Mariadassou
http://thesundayprogrammer.com
—
Reply to this email directly or view it on GitHub
<https://github.com/mlpack/mlpack/issues/531#issuecomment-191417423>.