How to run my code on the GPU instead of the CPU?

Hello everyone,
I'm trying to run my for loop on the GPU instead of the CPU. I searched a lot, and it seems I would have to rewrite my code, but some people said that isn't necessary and that there is a toolbox for this. If anyone could explain how to use these toolboxes, or how to rewrite my code in GPU form, please answer.
This is my code:
x = [1 5 9];            % input features
x = [1/9 5/9 1];        % normalized
y = [0.49 0.51];        % target outputs
min_error = 100;
for i = 1:10000
    w1 = rand(3,3);
    b1 = rand(1,3);     % first bias: 1 value going to 3 neurons
    w2 = rand(3,3);
    b2 = rand(1,3);     % second bias
    w3 = rand(3,2);
    b3 = rand(1,2);     % going to 2 output neurons
    % L1 input layer, L2 and L3 hidden layers, L4 output layer
    % L1:
    a1 = x;
    % L1 - L2:
    zh1 = (a1 * w1) + b1;
    a2 = logsig(zh1);
    % L2 - L3:
    zh2 = (a2 * w2) + b2;
    a3 = logsig(zh2);
    % L3 - L4:
    zh3 = (a3 * w3) + b3;
    a4 = logsig(zh3);
    err = abs(y - a4);
    percentage(i) = mean(err ./ y) * 100;   % elementwise division, not matrix division err/y
    if percentage(i) < min_error
        min_error = percentage(i);
        cw1 = w1;
        cw2 = w2;
        cw3 = w3;
        bw1 = b1;
        bw2 = b2;
        bw3 = b3;
    end
end
Accepted Answer

Walter Roberson on 13 Feb 2021
It is correct that you would have to rewrite your code to use the GPU. At the very least you would have to turn some of your arrays into gpuArray objects.
However, your operations are too small to be worth doing on the GPU. A GPU is best at doing the same thing to many data points, and at its worst on short calculations, because of the communication and synchronization overheads.
You would, I think, do much better to vectorize your calculations.
In particular, R2020b and later have pagemtimes(), so you could write:
N = 10000;
w1 = rand(3,3,N);
b1 = rand(1,3,N); % first bias , 1 value going to 3 neurons
zh1 = pagemtimes(x, w1) + b1;
a2 = logsig(zh1);
and similar operations. At the end you would get a vector of err values; min() that, extracting the index, and use the index to pull out the appropriate page from w1, b1, and so on.
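A sketch of the full vectorized version described above, using the same x, y, and network shapes as in the question. The reduction of each page's 1x2 error to a single percentage (here the mean relative error) is an assumption, since the original code's err/y expression was ambiguous:

```matlab
x = [1/9 5/9 1];        % 1x3 normalized input
y = [0.49 0.51];        % 1x2 target outputs
N = 10000;

% Generate all N candidate weight/bias sets at once, one per page.
w1 = rand(3,3,N);  b1 = rand(1,3,N);
w2 = rand(3,3,N);  b2 = rand(1,3,N);
w3 = rand(3,2,N);  b3 = rand(1,2,N);

% Forward pass for all N networks in one shot; pagemtimes
% broadcasts the 2-D x against the pages of w1.
a2 = logsig(pagemtimes(x,  w1) + b1);   % 1x3xN
a3 = logsig(pagemtimes(a2, w2) + b2);   % 1x3xN
a4 = logsig(pagemtimes(a3, w3) + b3);   % 1x2xN

% One error value per page: mean relative error, in percent.
err = squeeze(mean(abs(y - a4) ./ y, 2)) * 100;   % Nx1

% Pick the best candidate and pull out its weights and biases.
[min_error, k] = min(err);
cw1 = w1(:,:,k);  cw2 = w2(:,:,k);  cw3 = w3(:,:,k);
cb1 = b1(:,:,k);  cb2 = b2(:,:,k);  cb3 = b3(:,:,k);
```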
If that does not in itself speed things up enough, then those vectorized calculations would be suitable for processing on the GPU.
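The same vectorized code maps to the GPU with few changes: generate the arrays directly as gpuArray objects and gather the results at the end. This is a sketch assuming Parallel Computing Toolbox and a supported NVIDIA GPU; logsig is written out elementwise since exp() is known to support gpuArray inputs:

```matlab
N  = 10000;
x  = gpuArray([1/9 5/9 1]);
w1 = rand(3,3,N, 'gpuArray');   % generate directly on the GPU
b1 = rand(1,3,N, 'gpuArray');

zh1 = pagemtimes(x, w1) + b1;   % runs on the GPU
a2  = 1 ./ (1 + exp(-zh1));     % logsig, elementwise

a2_cpu = gather(a2);            % bring results back to host memory
```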
  1 Comment
Abdulaziz Alotaibi on 13 Feb 2021
Thanks a lot. I'm just getting started, so I thought it would be a good idea to use the GPU instead of the CPU, because in further applications I'm going to build and train a CNN model, and so on.
The GPU will be much faster for that kind of application, right?