Neural network accuracy improves on retraining without weight reinitialisation
4 views (last 30 days)
Frederick Turner
on 28 Jan 2018
Commented: Frederick Turner
on 29 Jan 2018
Apologies, as a similar question has been asked before but was never resolved. I am trying to create a neural network for a regression problem using nftool/nntool. I find that on the first training run the network sometimes performs quite poorly, but with subsequent training runs the regression accuracy seems to increase (pretty much with each successive run), even though the weights have not been reinitialised. Why does this happen (answers in terms of the error surface and backpropagation would be illustrative, though I don't need that much detail)? When the weights are not reinitialised, does each training run in MATLAB somehow 'build on' the previous one?
Thanks
0 comments
Accepted Answer
Greg Heath
on 29 Jan 2018
A net that already has trained weights will continue training from those weights.
If you wish to reinitialize to get an alternate design, use the function configure at the top of the loop.
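For example, a minimal sketch of what I mean (assuming x and t are your input and target matrices; the hidden layer size of 10 and the number of designs are arbitrary choices for illustration):

    net = fitnet(10);                   % feedforward regression net
    numDesigns = 5;
    perf = zeros(1, numDesigns);
    for k = 1:numDesigns
        net = configure(net, x, t);     % re-randomizes weights and biases
        [net, tr] = train(net, x, t);   % trains from the fresh initial weights
        perf(k) = tr.best_tperf;        % test-set performance of this design
    end

Without the configure call, each pass through the loop would continue training from the weights left by the previous pass, which is exactly the behaviour you are seeing.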
I have posted zillions of examples in both the NEWSGROUP (now only available in comp.soft-sys.matlab) and ANSWERS.
Try searching
greg configure
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (0)