GPU for HMM training
Mate 2u
on 29 Apr 2011
Commented: shahab anjum on 4 Mar 2020
Hi, I have a large dataset, and when I use a hidden Markov model to train on the data I have to wait a long time. Would GPU parallel computing speed this up? It seems the HMM functions (in the Statistics Toolbox) are not mentioned as supported functions within it.
Any experience or ideas?
0 comments
Accepted Answer
sgmustadio
on 2 May 2011
This is exactly the problem I'm working on for my thesis. Unfortunately, the main HMM training algorithm, the Baum-Welch algorithm (BWA), is recursive and cannot be parallelized (at least not easily). I'm looking into some work that Turin did on parallelizing the BWA here:
Additionally, the Segmental K-Means Algorithm is another way to train an HMM. My understanding is that it is computationally more efficient than BWA, but it still suffers from the same recursion problem (see the sketch below).
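To make the recursion concrete, here is a minimal sketch of the forward pass that Baum-Welch builds on (toy values, plain MATLAB, no toolbox required). Each column of alpha depends on the previous one, which is exactly the sequential dependency that blocks parallelization over time:
% Forward recursion for a toy 2-state, 3-symbol HMM (all values illustrative).
A  = [0.9 0.1; 0.2 0.8];          % A(i,j) = P(state j at t+1 | state i at t)
B  = [0.5 0.4 0.1; 0.1 0.3 0.6];  % B(i,s) = P(symbol s | state i)
p0 = [0.5 0.5];                   % initial state distribution
obs = [1 2 3 3 1];                % observed symbol indices
T = numel(obs);
N = size(A, 1);
alpha = zeros(N, T);
alpha(:, 1) = p0(:) .* B(:, obs(1));                    % initialization
for t = 2:T                                             % inherently sequential in t
    alpha(:, t) = (A' * alpha(:, t-1)) .* B(:, obs(t));
end
% P(obs | model) is sum(alpha(:, T)); Baum-Welch repeats this forward pass
% (plus a backward pass) every iteration, so the time loop cannot be distributed.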
Luckily, GPUs will help you if you are training many models or training a model with many states, as these are trivially parallel problems. From my research, you really only get benefits when you are talking about HMMs with thousands of states, as the overhead for GPUs is just too great up until that point.
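As a rough illustration of the "many independent models" case, the sketch below trains several HMMs in parallel with hmmtrain on toy data. It assumes the Statistics Toolbox (hmmtrain, hmmgenerate) and the Parallel Computing Toolbox (parfor), and it parallelizes over CPU workers, since hmmtrain itself does not run on the GPU:
% Train several independent 2-state, 3-symbol HMMs in parallel (toy data).
trueTR = [0.9 0.1; 0.05 0.95];            % illustrative transition matrix
trueE  = [0.6 0.3 0.1; 0.1 0.2 0.7];      % illustrative emission matrix
numModels = 8;
estTR = cell(numModels, 1);
estE  = cell(numModels, 1);
parfor m = 1:numModels
    seq = hmmgenerate(1000, trueTR, trueE);               % one training sequence
    guessTR = rand(2, 2);
    guessTR = bsxfun(@rdivide, guessTR, sum(guessTR, 2)); % random row-stochastic guess
    guessE  = rand(2, 3);
    guessE  = bsxfun(@rdivide, guessE, sum(guessE, 2));
    [tr, e] = hmmtrain(seq, guessTR, guessE);             % Baum-Welch per model
    estTR{m} = tr;
    estE{m}  = e;
end
Each model's Baum-Welch run is still sequential internally; the speedup only comes from the models being independent of each other.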
If you're interested in using GPUs in MATLAB, check out this free toolbox here:
Shoot me an email or a message if you need additional details; I'm glad to help :)
1 comment
shahab anjum
on 4 Mar 2020
Dear,
I have a 1000x286 power spectral density matrix. I want to use an HMM on that matrix. Can you please help me with how I can do that?
More Answers (2)
John Melonakos
on 7 Jun 2011
Jacket has been used heavily for HMMs on the GPU in MATLAB. K-Means is also really fast. Here is some sample code:
% initial points
n = 1e6; % number of points (try 3e6 for a larger run)
X = [randn(n,1) randn(n,1)] * 2;
ind = ceil(6*rand(n,1));
X(ind<3,1) = X(ind<3,1)/2 - 7;
X(ind==4,:) = X(ind==4,:)/3 + 4;
X = X';
% cluster
k = 20; % 20 clusters
X = gsingle(X);
[d n] = size(X);
assert(d == 2); % just for 2D case
% initial state: pick k evenly spaced points as starting centroids
Z = X(:, ceil(1:n/k:n));
Z = reshape(Z, [2 1 k]); % 2 x 1 x k so bsxfun broadcasts over the k clusters
% kmeans phase 1, 10 iterations
tic
for i = 1:10, i % echo the iteration number to show progress
% distances
D = bsxfun(@minus, X, Z);
D = sum(D .* D);
D = permute(D, [3 2 1]);
% minimum distances
[Y, I] = min(D);
% re-assign clusters
for j = 1:k
ii = find(I == j); if isempty(ii), error('empty cluster'); end
Z(:,1,j) = mean(X(:,ii), 2);
end
end
toc
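The same phase-1 update can also be sketched with gpuArray from the Parallel Computing Toolbox instead of Jacket's gsingle; this is only a rough equivalent, assuming the Parallel Computing Toolbox and a supported GPU are available:
% k-means phase 1 with gpuArray (rough equivalent of the Jacket version above)
n = 1e5; k = 20;
X = gpuArray(single(randn(2, n)));        % toy data on the GPU, 2 x n
Z = X(:, ceil(1:n/k:n));                  % initial centroids
Z = reshape(Z, [2 1 k]);                  % 2 x 1 x k for broadcasting
for iter = 1:10
    D = bsxfun(@minus, X, Z);             % 2 x n x k differences
    D = permute(sum(D .* D, 1), [3 2 1]); % k x n squared distances
    [~, I] = min(D, [], 1);               % nearest centroid per point
    for j = 1:k
        ii = find(I == j);
        if ~isempty(ii), Z(:, 1, j) = mean(X(:, ii), 2); end
    end
end
Z = gather(squeeze(Z));                   % centroids back on the CPU, 2 x k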
0 comments
engstudent
on 7 Dec 2012
Hi everyone, I am working on isolated word recognition and I need the steps to use the HMM in MATLAB 2012. If anyone can help me, please send help to my email "sajasaad_eng@yahoo.com".
0 comments