Using GPU, multiply a 3D matrix by a 2D matrix (slicewise)

2 views (last 30 days)
Brad Hesse
Brad Hesse on 26 Jul 2016
Commented: Edric Ellis on 27 Jul 2016
Hi everyone,
I am trying to vectorize the code for my neural network so that it can be quickly executed on the GPU.
I need to multiply each slice of 3D matrix X by a 2D matrix T.
X = 3D matrix.
T = 2D matrix.
Is there a good, fast way to do this on the GPU? I have seen it suggested to use 'repmat' to expand the 2D matrix into a 3D one (duplicating it many times), but that feels wasteful and inefficient.
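For concreteness, this is the slicewise operation I mean, written as a plain loop (a minimal sketch; the sizes are just placeholders):
X = rand(10, 10, 4);   % 3D array: a stack of 10-by-10 slices
T = rand(10);          % 2D matrix
Y = zeros(size(X));
for k = 1:size(X, 3)
    Y(:, :, k) = X(:, :, k) * T;   % multiply each slice by T
end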
Thanks!

Accepted Answer

Edric Ellis
Edric Ellis on 26 Jul 2016
In this case, you can use pagefun. For example:
X = rand(10, 10, 4, 'gpuArray');
T = rand(10, 'gpuArray');
pagefun(@mtimes, X, T)
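To sanity-check that pagefun is performing the slicewise multiplication, you can compare it against an explicit loop (a minimal sketch, reusing the sizes above):
Y = pagefun(@mtimes, X, T);          % multiply every page of X by T on the GPU
Yloop = zeros(10, 10, 4, 'gpuArray');
for k = 1:size(X, 3)
    Yloop(:, :, k) = X(:, :, k) * T; % same operation, one slice at a time
end
max(abs(Y(:) - Yloop(:)))            % should be 0, or within rounding error
Since mtimes is matrix multiplication, pagefun(@mtimes, X, T) computes X(:,:,k) * T for each page k, which is exactly the slicewise product.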
  2 comments
Brad Hesse
Brad Hesse on 27 Jul 2016
Oh my god! I am speechless! My entire forward/backward propagation algorithm worked on the very first try after I completely rewrote it to be vectorized for GPU execution. This is at least a 40-50x speed improvement (admittedly, my original code wasn't very well optimized for CPU execution to begin with).
I cannot believe how fast this is.
Thank you so much for your help, Edric. I had actually already tried pagefun, but it failed, and I assumed it didn't support multiplying a 3D array by a 2D matrix.
Edric Ellis
Edric Ellis on 27 Jul 2016
Glad it worked for you!


More Answers (0)
