[mlpack-git] [mlpack] Class to Finetune deep network (#460)

stereomatchingkiss notifications at github.com
Sun Oct 25 18:41:02 EDT 2015


I refactored the code; the API now becomes:

    template<typename EncoderType,
             typename OutputLayerType = regression::SoftmaxRegressionFunction,
             typename DataType = double,
             typename FineTuneGradient = SoftmaxFineTune>
    class FineTuneFunction
    {
     public:        
        /**
         * Construct the class with the given data.
         * @param input The input data for the encoder and OutputLayerType.
         * @param encoder The stacked encoder layers (must be a tuple; for now
         *        only SparseAutoencoder is supported).
         * @param outLayerType The output layer (e.g. softmax).
         * @param outLayerParam The parameters of the output layer.
         */
        FineTuneFunction(arma::Mat<DataType> &input,
                         EncoderType &encoder,
                         OutputLayerType &outLayerType,
                         arma::Mat<DataType> const &outLayerParam);
    };

Construct it like this:

    auto stackAE = std::forward_as_tuple(sae1, sae2, sae3);
    // We cannot get the softmax parameter from SoftmaxRegressionFunction,
    // so we have to pass it in.
    FineTuneFunction<decltype(stackAE)> finetune(trainData, stackAE, softmaxFunction, softmaxParameter);

However, this class depends on pull request #451, so it will have to wait until #451 is merged.

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/460#issuecomment-150980897

