Adam Optimizer with feedforward neural networks

14 views (last 30 days)
Manos Kav
Manos Kav on 30 Apr 2018
Commented: Bob on 18 Nov 2022
Hello, is there any way to use the Adam optimizer to train a neural network with the "train" function? Or a way to use this implementation ( https://www.mathworks.com/matlabcentral/fileexchange/61616-adam-stochastic-gradient-descent-optimization ) to train the network?
Thanks in advance.
  2 Comments
Abdelwahab Afifi
Abdelwahab Afifi on 14 Jun 2020
Did you get an answer?
Bob
Bob on 18 Nov 2022
Did any of you get an answer?


Answers (1)

Hrishikesh Borate
Hrishikesh Borate on 19 Jun 2020
Hi,
It’s my understanding that you want to use the Adam optimizer to train a neural network. This can be done with the trainNetwork function by setting the appropriate training options.
For example:
% Load the digit images and their rotation-angle labels (a regression task)
[XTrain,~,YTrain] = digitTrain4DArrayData;

% A simple convolutional network ending in a regression output
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer];

% 'adam' selects the Adam solver for trainNetwork
options = trainingOptions('adam', ...
    'InitialLearnRate',0.001, ...
    'GradientThreshold',1, ...
    'Verbose',false, ...
    'Plots','training-progress');

net = trainNetwork(XTrain,YTrain,layers,options);

% Evaluate on the held-out test set
[XTest,~,YTest] = digitTest4DArrayData;
YPred = predict(net,XTest);
rmse = sqrt(mean((YTest - YPred).^2))
For more information, refer to trainNetwork.
  1 Comment
Abdelwahab Afifi
Abdelwahab Afifi on 19 Jun 2020
'trainNetwork' is used for deep learning networks. But I think he wants to use the Adam optimizer to train a shallow neural network with the 'train' function.
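For a shallow network specifically, one possible workaround is to expose the network's weights and biases as a single vector with getwb/setwb and hand a loss function to an external optimizer such as the File Exchange submission linked in the question. The sketch below is untested: the optimizer's name fmin_adam, its argument order, and the step size are assumptions, so check the submission's own help before relying on them. getwb, setwb, configure, and perform are documented shallow-network functions.

```matlab
% Untested sketch: train a shallow feedforward network with an external
% Adam routine instead of 'train'. fmin_adam(fun, x0, stepSize) is an
% assumed signature for the File Exchange submission -- verify it.
[x, t] = simplefit_dataset;        % built-in toy regression dataset
net = feedforwardnet(10);          % shallow network, 10 hidden neurons
net = configure(net, x, t);        % size weights/biases for this data

wb0  = getwb(net);                 % all weights and biases as one vector
loss = @(wb) shallowLoss(wb, net, x, t);

wb  = fmin_adam(loss, wb0, 0.01);  % hypothetical call into the submission
net = setwb(net, wb);              % write the optimized weights back

function L = shallowLoss(wb, net, x, t)
    % Performance (MSE by default) of the network defined by vector wb
    net = setwb(net, wb);
    L = perform(net, t, net(x));
end
```

Alternatively, a dlnetwork custom training loop with the documented adamupdate function gives Adam without any File Exchange code, at the cost of leaving the shallow-network API.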

