Training option: How can we use a newly defined algorithm (as a training function) to train a deep neural network?

2 views (last 30 days)
Hi everyone. This is the first time I am designing a deep network in MATLAB. I do not want to use SGD, Adam, or the default solver. I have a new algorithm (a proposed unconstrained optimizer) that I would like to use to train the network, in order to check its training performance. How can I do this? Is it possible, and if so, how?
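(Editorial note for context: the usual route for this in MATLAB is a custom training loop built on dlnetwork, dlfeval, and dlgradient, with your own update rule in place of sgdmupdate/adamupdate. The outline below is only a rough sketch under assumed names; modelGradients, myOptimizerUpdate, dlX, dlY, and the learning rate are hypothetical placeholders, not code from this thread.)

numEpochs = 10;   % assumed number of epochs, for illustration only
for epoch = 1:numEpochs
    % Evaluate the loss and the gradients of the learnable parameters.
    [loss, gradients] = dlfeval(@modelGradients, dlnet, dlX, dlY);
    % Apply a custom update rule instead of sgdmupdate/adamupdate.
    dlnet = dlupdate(@myOptimizerUpdate, dlnet, gradients);
end

function [loss, gradients] = modelGradients(dlnet, dlX, dlY)
    % Forward pass, loss, and gradients with respect to dlnet.Learnables.
    % Assumes dlX is a formatted dlarray (e.g. 'SSCB') and dlY holds one-hot labels.
    dlYPred = softmax(forward(dlnet, dlX));
    loss = crossentropy(dlYPred, dlY);
    gradients = dlgradient(loss, dlnet.Learnables);
end

function w = myOptimizerUpdate(w, g)
    % Placeholder gradient step; replace with the proposed optimizer's update.
    w = w - 0.01*g;
end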

Accepted Answer

Srivardhan Gadila on 30 Oct 2020
2 comments
MAHSA YOUSEFI on 10 Nov 2020
Dear Srivardhan,
Following up on your answer about using a custom training loop, I have another problem.
I am trying to train a CNN with my own optimizer through a custom training loop:
[loss, gradient] = dlfeval(@modelgradient, dlnet, XTrain, YTrain)
myFun = @(dlnet,gradient,loss) myOptimizer(dlnet,gradient,loss,...)
dlnet = dlupdate(myFun, dlnet, gradient, loss)
My optimizer needs w (the current parameter vector), g (the corresponding gradient vector), f (the corresponding loss value), and so on as inputs. It performs many computations with w, g, and f internally to produce the update w = w + p, where p is a step vector that the optimizer has to compute and by which I update w.
I need a way to convert the parameters and gradients from their dl format into plain vectors for the computations inside my optimizer, and then to convert those vectors back into the dl format required by the training loop and by my optimizer (the gradients and dlnet's parameters are tables whose cells hold dlarray values). This back-and-forth conversion is essential for using the training loop in my work. Can you help me find functions in the toolbox that do this (vector to table and vice versa), or suggest any other solution?
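(Editorial note: there is no single built-in call for that conversion, but one workable approach is to loop over the Learnables/gradient table yourself, flatten each dlarray with extractdata and (:) into one long vector, and later reshape the updated vector back into the same table layout. The helpers below are only a sketch under that assumption; paramsToVector and vectorToParams are hypothetical names, not toolbox functions.)

function vec = paramsToVector(paramTable)
    % paramTable is e.g. dlnet.Learnables or the gradient table returned
    % by dlgradient; its Value column is a cell array of dlarray values.
    vals = paramTable.Value;
    vec = [];
    for i = 1:numel(vals)
        v = extractdata(vals{i});          % dlarray -> plain numeric array
        vec = [vec; double(v(:))];         %#ok<AGROW> flatten and append
    end
end

function paramTable = vectorToParams(vec, paramTable)
    % Write the entries of vec back into the table, keeping each
    % parameter's original size and numeric class. Assumes the parameters
    % are unformatted dlarrays, as in dlnet.Learnables.
    vals = paramTable.Value;
    offset = 0;
    for i = 1:numel(vals)
        sz = size(vals{i});
        n = prod(sz);
        chunk = reshape(vec(offset + (1:n)), sz);
        vals{i} = dlarray(cast(chunk, 'like', extractdata(vals{i})));
        offset = offset + n;
    end
    paramTable.Value = vals;
end

With helpers like these, the loop could form w = paramsToVector(dlnet.Learnables) and g = paramsToVector(gradients), run the optimizer to obtain wNew = w + p, and then assign dlnet.Learnables = vectorToParams(wNew, dlnet.Learnables) in place of the dlupdate call.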


More Answers (0)

