[mlpack-git] [mlpack] How to speed up HMM training ? (#550)
Ryan Curtin
notifications at github.com
Mon Mar 14 11:07:46 EDT 2016
>> It's worth noting that the GMM code you pasted where you iterate over each image and call gmm->Train(images[i], 1, true) is actually quite different than what hmm->Train(images) will do.
> Yes, I figured that out eventually and flattened all my sequences into one.
Okay, great. But I'm going to leave this ticket open, because it is still a problem that you needed to do this. Both of those training methodologies should produce the same result, yet they take wildly different amounts of time, so something may be wrong.
>> Another thing worth noting is that, in general, training a GMM-HMM without labeled data is a long process that may not give very good results. It's much better to do labeled training if possible.
> Does that change anything when there is only one state?
As far as results are concerned, probably not (since the inferred probability for each observation has to be 1 for the only state, which is what it would be for the labels anyway). For this implementation, though, different methods will be called to evaluate the probability of a GMM on labeled and unlabeled data. I wouldn't be surprised if the unlabeled approach trains a little bit slower (but I think any difference will be marginal, and I haven't tested exactly what the difference would be).
---
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/issues/550#issuecomment-196358786