In a FeedForward NNet, what exactly is one iteration?

Sam Speake on 23 May 2018
Commented: Greg Heath on 25 May 2018
When you train a feedforward neural net with the default settings, you see a GUI that includes "Epoch: 0 [ x iterations ] 1000". Does the x value represent the number of individual data samples that have been passed through the network (such as one image from a dataset of images), or does it represent a full pass over the entire dataset?

Accepted Answer

Majid Farzaneh on 24 May 2018
Hello. In every neural network there is an optimization algorithm that sets the optimum weights and biases, and optimization algorithms are usually iterative. One epoch means one iteration of the optimization algorithm.
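As a minimal sketch of what that counter corresponds to (the feedforwardnet/simplefit_dataset example below is an assumption for illustration, not part of this thread): with the toolbox's default batch training, the iteration counter in the training window advances once per epoch, i.e. once per full pass over the training set.
% Minimal sketch: the "x iterations" counter counts epochs (assumes default batch training)
[x, t] = simplefit_dataset;      % built-in example data
net = feedforwardnet(10);        % one hidden layer with 10 neurons
net.trainParam.epochs = 1000;    % the "1000" shown as the epoch limit in the GUI
[net, tr] = train(net, x, t);    % opens the training window
tr.num_epochs                    % number of iterations actually performed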
  3 Comments
Majid Farzaneh on 24 May 2018
Yes, that's true. For every change of the weights, the network needs to calculate the MSE, and to compute the MSE it has to evaluate all of the training data with the new weights.
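To illustrate this, a small hedged sketch (simplefit_dataset and 'dividetrain' are assumptions chosen for the example): the training record stores one MSE value per iteration, and with no validation/test split every training sample contributes to each of those values.
% Sketch: one MSE value is recorded per iteration, computed over all training samples
[x, t] = simplefit_dataset;
net = feedforwardnet(10);
net.divideFcn = 'dividetrain';          % all samples go into the training set
[net, tr] = train(net, x, t);
numel(tr.perf)                          % tr.num_epochs + 1 MSE values (includes epoch 0)
isequal(numel(tr.trainInd), numel(t))   % every sample is used for each MSE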
Greg Heath on 25 May 2018
Optimization algorithms TRY to optimize the goal. Many, if not most, times they do not achieve it.
Nevertheless, they are often considered successful if they just get close enough.
For example, I often design neural networks to yield an output target t, given an input x.
I take as a reference output
yref = mean(t')
The corresponding mean square error is
MSEref = mean(var(t',1))
My training goal is typically
MSEgoal = 0.01*MSEref
which preserves 99% of the target variance.
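A hedged sketch of how this reference-MSE goal can be plugged into training (simplefit_dataset and feedforwardnet(10) are stand-ins for illustration, not Greg's actual setup):
% Sketch of the MSEref / MSEgoal recipe; t' puts one sample per row
[x, t] = simplefit_dataset;
yref    = mean(t');                 % constant reference prediction
MSEref  = mean(var(t', 1));         % MSE of that reference = average target variance
MSEgoal = 0.01 * MSEref;            % aim to preserve 99% of the target variance
net = feedforwardnet(10);
net.trainParam.goal = MSEgoal;      % training stops once MSE <= MSEgoal
[net, tr] = train(net, x, t);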


More Answers (0)
