[mlpack-git] [mlpack] improve speed of SparseAutoencoder and make it more flexible (#451)

Ryan Curtin notifications at github.com
Tue Sep 22 22:47:46 EDT 2015


> + *   using SAEF = nn::SparseAutoencoderFunction;
> + *
> + *   const size_t Features = 16 * 16;
> + *   arma::mat data = arma::randu<arma::mat>(Features, 10000);
> + *
> + *   SAEF encoderFunction(data, Features, Features / 2);
> + *   const size_t numIterations = 100; // Maximum number of iterations.
> + *   const size_t numBasis = 10;
> + *   optimization::L_BFGS<SAEF> optimizer(encoderFunction, numBasis, numIterations);
> + *
> + *   arma::mat parameters = encoderFunction.GetInitialPoint();
> + *
> + *   // Train the model.
> + *   Timer::Start("sparse_autoencoder_optimization");
> + *   const double out = optimizer.Optimize(parameters);
> + *   Timer::Stop("sparse_autoencoder_optimization");

The API here is the way Siddharth originally wrote it, using the standard mlpack optimizers to optimize the weights of the network.  But maybe it would be a better idea to make this class work with the `Trainer` class in `src/mlpack/methods/ann/`, kind of like the examples in `convolutional_network_test.cpp` and `feedforward_network_test.cpp`:

```
SAEF autoencoder(...);
Trainer<SAEF> trainer(autoencoder, ...);
trainer.Train(...);
```

This would help give all of the ANN-related code in mlpack a unified interface.  I'd be interested in zoq's comments on this too, since I don't know his plans for what the API there will eventually look like.
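
For concreteness, here is a rough sketch of how that might look next to the quoted documentation example above.  The `Trainer` constructor arguments (maximum epochs, batch size) and the four-argument `Train()` call are assumptions made purely for illustration; the real interface in `src/mlpack/methods/ann/trainer/` may take different parameters, and the autoencoder function would likely need a network-style wrapper or adapter before `Trainer` could drive it:

```
// Hypothetical sketch only: the Trainer parameters and the Train() signature
// below are guesses for illustration, not the actual interface.
const size_t Features = 16 * 16;
arma::mat data = arma::randu<arma::mat>(Features, 10000);

// Same construction as in the documentation example above.
SAEF autoencoder(data, Features, Features / 2);

// Assumed constructor arguments: (network, maximum epochs, batch size).
Trainer<SAEF> trainer(autoencoder, 100, 32);

// An autoencoder reconstructs its input, so the data would act as both the
// input and the target (assumed arguments: training input/target,
// validation input/target).
trainer.Train(data, data, data, data);
```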

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/451/files#r40166914