GPU for HMM training

5 views (last 30 days)
Mate 2u on 29 Apr 2011
Commented: shahab anjum on 4 Mar 2020
Hi, I have a large dataset, and when I use a hidden Markov model to train on the data I have to wait a long time. Would GPU parallel computing speed this up? It seems the HMM functions (in the stats toolbox) are not mentioned as supported functions for GPU computing.
Any experience or ideas?

Accepted Answer

sgmustadio on 2 May 2011
This is exactly the problem I'm working on for my thesis. Unfortunately, the main HMM training algorithm (Baum-Welch) is recursive and cannot be parallelized (at least not easily). I'm looking into some work that Turin did on parallelizing the BWA here:
Additionally, the Segmental K-Means Algorithm is another way to train an HMM. My understanding is that it is computationally more efficient than BWA, but it still suffers from the same recursion problem.
Luckily, GPUs will help if you are training many models or training a model with many states, as these are trivially parallel problems. From my research, you only really see a benefit once you are dealing with HMMs with thousands of states; below that point the GPU overhead is just too great.
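To make the recursion concrete, here is a minimal sketch (toy sizes, no scaling, my own illustration rather than part of the original answer) of the forward recursion at the heart of Baum-Welch; the Viterbi recursion used by segmental k-means has the same sequential structure. Each time step needs the previous one, which is what blocks the time loop from being split across workers:
% Minimal sketch of the forward recursion inside Baum-Welch (toy sizes,
% no scaling). All variable names here are illustrative, not from a toolbox.
K = 3;  T = 100;                          % number of states, sequence length
A = rand(K);                              % transition matrix...
A = bsxfun(@rdivide, A, sum(A, 2));       % ...made row-stochastic
B = rand(K, T);                           % emission likelihood of each state at each time
p0 = ones(K, 1) / K;                      % uniform initial state distribution

alpha = zeros(K, T);
alpha(:, 1) = p0 .* B(:, 1);
for t = 2:T
    % alpha(:, t) needs alpha(:, t-1), so the loop over time is inherently serial
    alpha(:, t) = (A.' * alpha(:, t-1)) .* B(:, t);
end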
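As a concrete illustration of the "many independent models" case, here is a minimal sketch (my own addition, not part of the original answer) that trains several HMMs in parallel with parfor. It assumes the Statistics Toolbox functions hmmgenerate/hmmtrain and the Parallel Computing Toolbox are available; the model sizes and data are toy values for illustration only:
% Train several independent HMMs in parallel (sketch, toy data)
numModels = 8;  numStates = 4;  numSymbols = 6;

% Toy observation sequences, one per model (replace with your own data)
trueTrans = rand(numStates);
trueTrans = bsxfun(@rdivide, trueTrans, sum(trueTrans, 2));
trueEmis = rand(numStates, numSymbols);
trueEmis = bsxfun(@rdivide, trueEmis, sum(trueEmis, 2));
seqs = cell(numModels, 1);
for m = 1:numModels
    seqs{m} = hmmgenerate(5000, trueTrans, trueEmis);
end

transEst = cell(numModels, 1);
emisEst  = cell(numModels, 1);
parfor m = 1:numModels
    % Random row-stochastic initial guesses for each model
    tg = rand(numStates);             tg = bsxfun(@rdivide, tg, sum(tg, 2));
    eg = rand(numStates, numSymbols); eg = bsxfun(@rdivide, eg, sum(eg, 2));
    [transEst{m}, emisEst{m}] = hmmtrain(seqs{m}, tg, eg);
end
Each iteration of the parfor loop is a completely independent Baum-Welch run, which is exactly the kind of coarse-grained parallelism that scales well.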
If you're interested in using GPUs in MATLAB, check out this free toolbox here:
Shoot me an email or a message if you need additional details; I'd be glad to help :)
1 Comment
shahab anjum on 4 Mar 2020
Dear, I have a 1000x286 power spectral density matrix. I want to use an HMM on that matrix. Please, can you help me with how I can do that?


More Answers (2)

John Melonakos on 7 Jun 2011
Jacket has been used heavily for HMMs on the GPU in MATLAB. K-means is also really fast. Here is some sample code:
% Initial points: n random 2-D samples pushed into a few separated blobs
n = 1e6;                          % number of points (or 3e6 for a larger test)
X = [randn(n,1) randn(n,1)] * 2;
ind = ceil(6*rand(n,1));
X(ind<3,1)  = X(ind<3,1)/2 - 7;
X(ind==4,:) = X(ind==4,:)/3 + 4;
X = X';                           % 2-by-n: one column per point

% Cluster setup
k = 20;                           % number of clusters
X = gsingle(X);                   % move the data to the GPU (Jacket type)
[d, n] = size(X);
assert(d == 2);                   % this demo covers the 2-D case only

% Initial state: k evenly spaced points serve as the starting centroids
Z = X(:, ceil(1:n/k:n));
Z = reshape(Z, [2 1 k]);          % 2-by-1-by-k so it expands against X

% k-means phase 1: 10 iterations
tic
for i = 1:10
    disp(i)                       % show progress
    % Squared distance from every point to every centroid (k-by-n)
    D = bsxfun(@minus, X, Z);
    D = sum(D .* D);
    D = permute(D, [3 2 1]);
    % Assign each point to its nearest centroid
    [Y, I] = min(D);
    % Recompute each centroid as the mean of its assigned points
    for j = 1:k
        ii = find(I == j);
        if isempty(ii), error('empty cluster'); end
        Z(:,1,j) = mean(X(:,ii), 2);
    end
end
toc
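If Jacket is not available, a minimal sketch of the same idea with the Parallel Computing Toolbox's built-in gpuArray type (my assumption, not part of the original answer) changes only where the data lives; bsxfun, permute, min and mean all accept gpuArray inputs:
X = gpuArray(single(X));  % instead of gsingle(X): move the 2-by-n data to the GPU
% ...run the same clustering loop as above unchanged...
Z = gather(Z);            % copy the final centroids back to host memory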

engstudent on 7 Dec 2012
Hi everyone, I am working on isolated word recognition and I need the steps to use an HMM in MATLAB 2012. If anyone can help, please send help to my email: "sajasaad_eng@yahoo.com".
