When should I stop training a neural network?
4 views (last 30 days)
Rodrigo Beltran
on 24 Jan 2019
Edited: Greg Heath
on 26 Jan 2019
I'm working on a neural network with backpropagation. The network has 6 inputs, 1 hidden layer (6 neurons in that layer), and 1 output. I train the network with the "Levenberg-Marquardt" and "Bayesian Regularization" algorithms. The idea is to "predict" a result, but the results are not the right ones according to the table of historical data.
To decide when to stop the training, I currently look at the regression plot, the mean squared error, and the regression R values, which show the "ideal values", but the results are still not accurate and are not even close for inputs that "don't exist" in the table of historical data.
What plot should I look at to know whether the network is overfitting or correctly trained?
0 comments
Accepted Answer
Greg Heath
on 26 Jan 2019
Edited: Greg Heath
on 26 Jan 2019
The danger is OVERTRAINING an OVERFIT NET. There are several approaches.
1. PREVENT OVERFITTING the I-H-O net by having the number of training equations
Ntrneq = Ntrn*O
be no smaller than the number of unknown weights,
Nw = (I+1)*H+(H+1)*O
i.e., Ntrneq >= Nw
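For the 6-6-1 net in the question, the two counts can be checked directly; a quick sketch (the value of Ntrn is a placeholder, since the question does not state the training-set size):

```matlab
% Network dimensions from the question: I-H-O = 6-6-1
I = 6; H = 6; O = 1;

% Number of unknown weights, including biases
Nw = (I+1)*H + (H+1)*O;     % (6+1)*6 + (6+1)*1 = 49

% Number of training equations for Ntrn training samples
Ntrn = 100;                 % example value; use your own training-set size
Ntrneq = Ntrn*O;

% Condition 1: Ntrneq >= Nw, i.e. at least Nw/O = 49 training samples here
fprintf('Nw = %d, Ntrneq = %d, satisfied = %d\n', Nw, Ntrneq, Ntrneq >= Nw)
```

If the condition fails, either gather more training data or reduce H until Nw drops below Ntrneq.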
2. PREVENT OVERTRAINING by using a reasonable training goal
mse(target-output) <= 0.01 * mse(target-mean(target')')
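In MATLAB this goal can be set on the network before training; a minimal sketch, assuming targets are stored as an O-by-N matrix (the random data here is only illustrative):

```matlab
% Example data (replace with your own 6-by-N inputs and 1-by-N targets)
x = rand(6, 200);
t = sum(x) + 0.1*randn(1, 200);

net = fitnet(6, 'trainlm');            % 6 hidden neurons, Levenberg-Marquardt

% Training goal: MSE no larger than 1% of the average target variance,
% i.e. aim for roughly R^2 >= 0.99
net.trainParam.goal = 0.01*mean(var(t', 1));

[net, tr] = train(net, x, t);
```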
3. PREVENT OVERTRAINING by using a validation subset to implement
EARLY STOPPING
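The validation-based early stopping happens automatically through the toolbox's data-division settings; a sketch showing the defaults explicitly (ratios and `max_fail` are the toolbox defaults, the data is illustrative):

```matlab
x = rand(6, 200);
t = sum(x) + 0.1*randn(1, 200);

net = fitnet(6, 'trainlm');
net.divideFcn = 'dividerand';          % default: random train/val/test split
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net.trainParam.max_fail    = 6;        % stop after 6 consecutive validation
                                       % error increases (default)
[net, tr] = train(net, x, t);
% tr.best_epoch is the epoch with the lowest validation error
```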
4. PREVENT OVERTRAINING by using TRAINBR to implement BAYESIAN
REGULARIZATION LEARNING
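A sketch of option 4 (again with illustrative data); trainbr regularizes the weights directly, so it does not use a validation subset for early stopping:

```matlab
x = rand(6, 200);
t = sum(x) + 0.1*randn(1, 200);

net = fitnet(6, 'trainbr');            % Bayesian regularization training
net.divideFcn = 'dividetrain';         % give all data to training; trainbr
                                       % penalizes large weights instead of
                                       % relying on early stopping
[net, tr] = train(net, x, t);
```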
EARLY STOPPING (3) is automatic with the default TRAINLM.
Typically I try to implement 1-3 by minimizing H in addition to using EARLY STOPPING with the default training algorithm TRAINLM (3).
On rare occasions I will use TRAINBR (4) when 1-3 do not yield satisfactory results.
Searching BOTH NEWSGROUP and ANSWERS using
Greg Ntrneq Nw
should yield zillions of examples.
Hope this helps.
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
0 comments
More Answers (0)