[mlpack-git] [mlpack] implement CNN and LeNet1 (#405)

Marcus Edel notifications at github.com
Mon Mar 16 17:19:47 EDT 2015


Typically, convolutions in CNNs involve relatively small filters, so it often isn't beneficial to go with the FFT: we need to transform the input using the FFT and afterwards transform the product of the transformed inputs back using the inverse FFT, which adds overhead. That said, as you pointed out, the FFT-based convolution performs better than the simple loop convolution. There is an interesting paper on this: "Fast Training of Convolutional Networks through FFTs" by Michael Mathieu, Mikael Henaff and Yann LeCun.
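
For reference, a rough sketch of the FFT path (just Armadillo's fft2()/ifft2(), not the code from this PR) that shows where the extra transform overhead comes from:

```cpp
#include <armadillo>

// Sketch only: a "full" 2-D convolution computed via the FFT. Both
// operands are zero-padded to the output size, transformed, multiplied
// element-wise, and transformed back with the inverse FFT. The padding
// and the two forward plus one inverse transforms are the overhead that
// makes this unattractive for very small filters.
arma::mat FFTConvolution(const arma::mat& input, const arma::mat& filter)
{
  const size_t rows = input.n_rows + filter.n_rows - 1;
  const size_t cols = input.n_cols + filter.n_cols - 1;

  // Zero-pad both operands to the size of the full convolution.
  arma::mat paddedInput(rows, cols, arma::fill::zeros);
  arma::mat paddedFilter(rows, cols, arma::fill::zeros);
  paddedInput.submat(0, 0, input.n_rows - 1, input.n_cols - 1) = input;
  paddedFilter.submat(0, 0, filter.n_rows - 1, filter.n_cols - 1) = filter;

  // Convolution in the spatial domain is element-wise multiplication
  // in the frequency domain.
  const arma::cx_mat product = arma::fft2(paddedInput) % arma::fft2(paddedFilter);

  // Transform back and drop the (numerically tiny) imaginary part.
  return arma::real(arma::ifft2(product));
}
```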

The code looks good, but I think we should make the convolution method a template parameter, like we did with the weight initialization method. That would allow the user to implement other methods for the convolution. What do you think? A rough sketch of what I mean is below.
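
Something along these lines (just a sketch, the names are made up and not the actual mlpack API): any policy class that provides a static Convolution() method could be plugged in, analogous to the weight initialization rule.

```cpp
#include <armadillo>

// One possible policy: a plain sliding-window loop ("valid" convolution;
// the kernel flip is omitted for brevity, so strictly this is a
// cross-correlation).
struct NaiveConvolutionRule
{
  static void Convolution(const arma::mat& input, const arma::mat& filter,
                          arma::mat& output)
  {
    output.set_size(input.n_rows - filter.n_rows + 1,
                    input.n_cols - filter.n_cols + 1);
    for (size_t i = 0; i < output.n_rows; ++i)
      for (size_t j = 0; j < output.n_cols; ++j)
        output(i, j) = arma::accu(filter %
            input.submat(i, j, i + filter.n_rows - 1, j + filter.n_cols - 1));
  }
};

// The layer only delegates to the chosen policy, so swapping in e.g. an
// FFT-based rule requires no change to the layer itself.
template<typename ConvolutionRule = NaiveConvolutionRule>
class ConvLayer
{
 public:
  explicit ConvLayer(const arma::mat& weights) : weights(weights) { }

  void Forward(const arma::mat& input, arma::mat& output)
  {
    ConvolutionRule::Convolution(input, weights, output);
  }

 private:
  arma::mat weights;
};
```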


I think the code is almost ready to be merged, two more points:

- Could you remove the debug messages? I know there is no downside when building with DEBUG=OFF.
- Maybe we can find another test that checks the code and takes under 10 minutes? We partly rely on Travis to run all tests, and unfortunately the time there is limited, so we try to minimize the time for a single test suite. Maybe we can decrease the number of epochs by using a pre-trained network?

---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/405#issuecomment-81947242