GPU computing doesn't work with timedelaynet, inputDelays != 0:0
Hello,
So, I'm in the process of converting a working feedforwardnet to a timedelaynet, and MATLAB hangs whenever I use the 'useGPU','yes' option. The nearly identical feedforwardnet works, as does the timedelaynet with inputDelays = 0:0, but setting inputDelays to 0:1 (or any other delay range of length > 1) causes an indefinite hang. Code is below. Any ideas?
Thank you!
(MATLAB R2015b)
Variables:
input: 19x4100 double
output: 1x4100 double
trainFcn = 'trainscg';                  % scaled conjugate gradient
hiddenLayerSize = [10,10];
inputDelays = 0:1;                      % any delay range longer than 1 triggers the hang
net = timedelaynet(inputDelays,hiddenLayerSize,trainFcn);
X = tonndata(input,true,false);         % 19x4100 matrix -> 1x4100 cell of 19x1 columns
T = tonndata(output,true,false);        % 1x4100 matrix  -> 1x4100 cell of 1x1 values
[x,xi,ai,t] = preparets(net,X,T);       % shift data for the tapped delay line
net.divideParam.trainRatio = 80/100;
net.divideParam.testRatio = 10/100;
net.divideParam.valRatio = 10/100;
net.trainParam.max_fail = 25;
net.performFcn = 'mse';
[net,tr] = train(net,x,t,xi,ai,'useGPU','yes');   % hangs here
Answers (1)
Mark Hudson Beale
on 20 Dec 2015
Currently the GPU implementation of training does not parallelize across the timesteps of a single series. If you have a long series and can break it up into many shorter series, it will parallelize well.
For instance, if your series looks like this, it will run on only one core, which will be slow:
input dimensions = num_inputs-by-timesteps cell of Nx1 values
target dimensions = num_outputs-by-timesteps cell of Mx1 values
If you have multiple series, your data will look like this, where Q is the number of series being trained on or evaluated in parallel:
input dimensions = num_inputs-by-timesteps cell of NxQ values
target dimensions = num_outputs-by-timesteps cell of MxQ values
The larger the number of series Q, the more parallelism on the GPU.
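One way to get that shape from a single long series is to cut the 4100 timesteps into Q equal contiguous segments and recombine them along the samples dimension with catsamples. This is a sketch rather than tested code; the segment count Q and the discarding of leftover timesteps are assumptions for illustration:
Q = 20;                                  % assumed number of parallel sub-series
segLen = floor(size(input,2)/Q);         % timesteps per sub-series (4100/20 = 205)
Xseg = cell(1,Q);
Tseg = cell(1,Q);
for q = 1:Q
    cols = (q-1)*segLen + (1:segLen);    % contiguous block of timesteps
    Xseg{q} = tonndata(input(:,cols),true,false);   % 1-by-segLen cell of 19x1
    Tseg{q} = tonndata(output(:,cols),true,false);  % 1-by-segLen cell of 1x1
end
Xq = catsamples(Xseg{:});                % 1-by-segLen cell of 19xQ matrices
Tq = catsamples(Tseg{:});                % 1-by-segLen cell of 1xQ matrices
[x,xi,ai,t] = preparets(net,Xq,Tq);
[net,tr] = train(net,x,t,xi,ai,'useGPU','yes');
The trade-off is that the temporal dependence across segment boundaries is lost, so keep each segment long relative to the delay window.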