[mlpack-git] [mlpack] Added LeakyReLU (and hardtanh) activation layer and its test. (#544)

Marcus Edel notifications at github.com
Thu Mar 3 13:33:04 EST 2016


I would appreciate it if you could open a pull request for the LeakyReLU layer and another pull request for the HardTanh layer. It makes it a lot easier for me to go through the code and make comments.

Regarding the test, everything between min and max should be 1 (min <= x <= max).

As you might have already noticed, writing tests in mlpack is done with the Boost Unit Test Framework. If you write a test suite called "TestSuite" (BOOST_AUTO_TEST_SUITE(TestSuite)), and then build 'mlpack_test' ('make mlpack_test'), you can run only the tests in that test suite with 'bin/mlpack_test -t TestSuite'.  A specific test case called 'TestCase' (BOOST_AUTO_TEST_CASE(TestCase)) could be run with 'bin/mlpack_test -t TestSuite/TestCase'.

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/544#issuecomment-191900999

