[mlpack-git] [mlpack] Added LeakyReLU activation layer and its test. (#544)

Dhawal Arora notifications at github.com
Thu Mar 3 01:50:10 EST 2016


Hi, I have added the LeakyReLU activation function and its tests, and I am also adding a hard tanh activation. Please let me know if any modifications are needed. I have not run the Boost tests on my system yet. If you agree, I could add PReLU as well, which would additionally require training the alpha parameter. Thanks.
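
For context, the LeakyReLU activation and its derivative look roughly like the sketch below. This is only an illustrative sketch, not the actual contents of leaky_relu_function.hpp, and the default alpha value shown is an assumption:

    #include <algorithm>

    // LeakyReLU: f(x) = x for x >= 0, alpha * x otherwise.
    // For 0 < alpha < 1 this is equivalent to max(x, alpha * x).
    inline double LeakyReLU(const double x, const double alpha = 0.03)
    {
      return std::max(x, alpha * x);
    }

    // Derivative: f'(x) = 1 for x >= 0, alpha otherwise
    // (the derivative at x = 0 is taken as 1 by convention).
    inline double LeakyReLUDeriv(const double x, const double alpha = 0.03)
    {
      return (x >= 0) ? 1.0 : alpha;
    }

PReLU would use the same forward and derivative formulas, but with alpha learned during training instead of fixed.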
You can view, comment on, or merge this pull request online at:

  https://github.com/mlpack/mlpack/pull/544

-- Commit Summary --

  * Added LeakyReLU activation layer and its test

-- File Changes --

    A src/mlpack/methods/ann/activation_functions/leaky_relu_function.hpp (230)
    M src/mlpack/tests/activation_functions_test.cpp (71)

-- Patch Links --

https://github.com/mlpack/mlpack/pull/544.patch
https://github.com/mlpack/mlpack/pull/544.diff

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/544
