How can I use a NARX network on a GPU?
Hello.
I want to use a GPU to speed up training of a NARX network. Training stalls at the line 'Computing Resources: GPU device #1, GeForce GTX 750 Ti'; I waited 3 hours and nothing further was printed.
My network has 3 layers (input layer, hidden layer, output layer), with 15 inputs and 1 output. I decided to divide the inputs and outputs into training, validation, and test sets, but I don't understand how to do it. Please tell me how to divide the training data.
MATLAB version: R2017b
Program:

    X = con2seq(input);   % input is a 15x30001 matrix loaded from a MAT-file
    T = con2seq(output);  % output is a 1x30001 matrix loaded from a MAT-file
    net = narxnet(1:2,1:4,5,'open');
    [x,xi,ai,t] = preparets(net,X,{},T);
    % note: train expects targets t and input states xi before layer states ai;
    % the original call train(net,x,ai,...) passed ai in the targets position
    [net,tr] = train(net,x,t,xi,ai,'useGPU','yes');
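Regarding how to divide the data: for time-series networks such as NARX, the split is usually configured on the network object via `net.divideFcn` and `net.divideParam` rather than by manually slicing the matrices. A minimal sketch follows; the 70/15/15 ratios are an illustrative assumption, not values from the question. `'divideblock'` keeps the timesteps in order (contiguous blocks), which is generally preferable to the default `'dividerand'` for sequential data.

```matlab
% Hypothetical sketch: let the toolbox split the 30001 timesteps into
% contiguous train/validation/test blocks before training.
net = narxnet(1:2,1:4,5,'open');
net.divideFcn = 'divideblock';        % contiguous blocks, preserves time order
net.divideParam.trainRatio = 0.70;    % first 70% of timesteps for training
net.divideParam.valRatio   = 0.15;    % next 15% for validation
net.divideParam.testRatio  = 0.15;    % final 15% for testing

[x,xi,ai,t] = preparets(net,X,{},T);
% 'showResources' prints which device (CPU/GPU) actually runs the training,
% which helps diagnose a hang like the one described above.
[net,tr] = train(net,x,t,xi,ai,'useGPU','yes','showResources','yes');
```

After training, `tr.trainInd`, `tr.valInd`, and `tr.testInd` record which timestep indices landed in each subset.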