Using GPU, multiply a 3D matrix by a 2D matrix (slicewise)
Brad Hesse on 26 Jul 2016
Commented: Edric Ellis on 27 Jul 2016
Hi everyone,
I am trying to vectorize my neural network code so that it executes quickly on the GPU.
I need to multiply each slice of a 3D matrix X by a 2D matrix T, i.e., compute X(:,:,k) * T for every slice k.
Is there a good, fast way to do this on the GPU? I have heard it suggested to use repmat to expand the 2D matrix into a 3D matrix (duplicating it many times), but that feels wasteful and inefficient.
Thanks!
Accepted Answer
Edric Ellis on 26 Jul 2016
% Example: four 10-by-10 slices in X, one 10-by-10 matrix T, both on the GPU
X = rand(10, 10, 4, 'gpuArray');
T = rand(10, 'gpuArray');
% Multiply every slice (page) of X by T in a single call
pagefun(@mtimes, X, T)
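For reference, here is a minimal sketch (using the same assumed 10-by-10-by-4 example sizes as above) comparing the single pagefun call against an explicit per-slice loop; the two results should agree up to floating-point round-off:

X = rand(10, 10, 4, 'gpuArray');
T = rand(10, 'gpuArray');
Y = pagefun(@mtimes, X, T);            % all slices multiplied in one batched call
Yloop = zeros(10, 10, 4, 'gpuArray');  % same result, built slice by slice
for k = 1:size(X, 3)
    Yloop(:,:,k) = X(:,:,k) * T;
end
max(abs(Y(:) - Yloop(:)))              % should be ~0 (round-off only)

The explicit loop processes one slice at a time, whereas pagefun hands all of the slice multiplications to the GPU as a single batched operation, and it avoids the duplication that the repmat approach in the question would require.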