Using GPU, multiply a 3D matrix by a 2D matrix (slicewise)
Brad Hesse on 26 Jul 2016
Commented: Edric Ellis on 27 Jul 2016
Hi everyone,
I am trying to vectorize the code for my neural network so that it can be quickly executed on the GPU.
I need to multiply each 2-D slice of a 3-D array X by a 2-D matrix T.
Is there a good, fast way to do this on the GPU? I have heard it suggested to use 'repmat' to replicate the 2-D matrix into a 3-D array (duplicating it many times), but that feels wasteful and inefficient; the operation I want is sketched as a loop below.
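For concreteness, here is the slicewise product written as a plain loop (a sketch, assuming size(X,2) == size(T,1)); this is the computation I would like to vectorize:

Y = zeros(size(X,1), size(T,2), size(X,3));
for k = 1:size(X,3)
    Y(:,:,k) = X(:,:,k) * T;   % multiply slice k of X by T
end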
Thanks!
Accepted Answer
Edric Ellis on 26 Jul 2016
X = rand(10, 10, 4, 'gpuArray');   % 3-D array on the GPU: four 10-by-10 slices
T = rand(10, 'gpuArray');          % 10-by-10 matrix on the GPU
Y = pagefun(@mtimes, X, T)         % multiply each page (slice) of X by T
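A quick sanity check (a sketch using the example sizes above) is to compare against an explicit loop over the slices:

Yloop = zeros(10, 10, 4, 'gpuArray');
for k = 1:4
    Yloop(:,:,k) = X(:,:,k) * T;   % same product, one slice at a time
end
max(abs(Y(:) - Yloop(:)))          % expect 0, or at most on the order of eps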