GPU support in Multilabel Example?

3 views (last 30 days)
Moritz Scherrmann on 19 Jun 2020
Edited: Moritz Scherrmann on 19 Jun 2020
Hi all,
I am currently working with the following MATLAB example:
https://de.mathworks.com/help/deeplearning/ug/multilabel-text-classification-using-deep-learning.html
My problem is that even though the "canUseGPU" function returns true (I am working on a GeForce RTX 2080 Ti on my local machine), I think the example does not actually run on the GPU. However, the example states that it should run on the GPU (lines 181-183 in the live script):
% If training on a GPU, then convert data to gpuArray.
if (executionEnvironment == "auto" && canUseGPU) || executionEnvironment == "gpu"
    dlX = gpuArray(dlX);
end
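For what it's worth, a minimal check like this (my own lines, not from the example) confirms that MATLAB sees the device and that the condition above should be true:
% My own diagnostic lines, not from the example.
gpuDevice               % shows the GeForce RTX 2080 Ti, so the device is visible
canUseGPU               % returns 1 (true) in my case
executionEnvironment    % "auto" in the example, so the gpuArray branch runs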
When I run the code and start training, Task Manager shows no activity on the GPU.
Furthermore, I noticed that the following embedding function removes the gpuArray property of the data X (lines 387-397):
function Z = embedding(X, weights)
    % Reshape inputs into a vector.
    [N, T] = size(X, 2:3);
    X = reshape(X, N*T, 1);
    % Index into embedding matrix.
    Z = weights(:, X);
    % Reshape outputs by separating batch and sequence dimensions.
    Z = reshape(Z, [], N, T);
end
I assume that this leads to the computation running on the CPU instead of the GPU.
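If that assumption is right, the learnable parameters are probably still on the CPU, and moving them to the GPU before training should fix it. A minimal sketch of what I mean (my own lines, not from the example; I am assuming the parameters struct and the emb.Weights field name used there):
% My own sketch, not from the example. Check whether the embedding weights
% live on the GPU (field name emb.Weights assumed from the example):
isa(extractdata(parameters.emb.Weights), 'gpuArray')
% If that returns 0, move every learnable parameter to the GPU before the
% training loop, so that weights(:,X) in embedding also returns a gpuArray:
if canUseGPU
    parameters = dlupdate(@gpuArray, parameters);
end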
Can anybody help me solve this issue?
Thank you very much!

Answers (0)
