MATLAB gives me different value of output every time I train a neural network, why?
mun1013
on 1 Jul 2015
I am training a multilayer neural network. Input data (3 features, 150 samples): 3x150; target: 1x150.
I did not specify the weights and biases; is that the reason it returns a different output every time I train the neural network?
Accepted Answer
Greg Heath
on 2 Jul 2015
The default data division and weight initialization are both random.
To reproduce a design you have to know the state of the RNG before the net is configured with initial weights and before the data is divided into training, validation and testing subsets.
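A minimal sketch of fixing the RNG state before any of that randomness occurs (the data, hidden-layer size and seed value are placeholder assumptions, not from the question):

```matlab
% Placeholder data matching the question's dimensions
x = rand(3,150);                 % 3 features, 150 samples
t = rand(1,150);                 % 1x150 target

rng(0);                          % fix the RNG state before net creation
net = fitnet(10);                % 10 hidden neurons (example value)
[net, tr] = train(net, x, t);    % random init and data division happen here
y = net(x);                      % same output every run with the same seed
```

Reseeding with the same value before each design makes both the initial weights and the train/validation/test split repeatable.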
When designing multiple nets in a double for loop (creation in the outer loop and training in the inner loop), you only have to initialize the RNG once: before the first loop. The RNG changes its state every time it is called. Therefore, for reproducibility, record the RNG state at the beginning of the inner loop.
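A sketch of the double-loop pattern described above, with the RNG seeded once before the outer loop and its state recorded at the top of the inner loop (data and loop sizes are illustrative assumptions):

```matlab
x = rand(3,150); t = rand(1,150);     % placeholder data
rng(0);                               % initialize the RNG once, before the loops
states = cell(5,10);                  % store one RNG state per design
for i = 1:5                           % outer loop: net creation
    net = fitnet(10);
    for j = 1:10                      % inner loop: training trials
        states{i,j} = rng;            % record state for later reproduction
        net = configure(net, x, t);   % reinitialize weights (calls the RNG)
        [net, tr] = train(net, x, t);
    end
end
% To reproduce design (i,j) later: rng(states{i,j}), then repeat the
% configure/train body for that trial.
```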
Exactly when the RNG is called differs between the different generations of designs. For special cases of the obsolete NEWFF family (e.g., NEWFIT, NEWPR and NEWFF), weights are initialized when the nets are created. For special cases of the current FEEDFORWARDNET family (e.g., FITNET, PATTERNNET and FEEDFORWARDNET), weights can be initialized explicitly by the CONFIGURE function. Otherwise, they will be automatically initialized by the function TRAIN.
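The two initialization paths for the current family can be sketched like this (placeholder data; hidden-layer size is an example value):

```matlab
x = rand(3,150); t = rand(1,150);     % placeholder data

% Path 1: explicit initialization via CONFIGURE
net = fitnet(10);                     % weights not set yet
net = configure(net, x, t);           % RNG called here to initialize weights

% Path 2: no CONFIGURE, so TRAIN initializes the weights itself
net2 = fitnet(10);
[net2, tr] = train(net2, x, t);       % RNG called inside train
```

This is why the point at which you seed or record the RNG state matters: it must come before whichever call actually draws the random weights.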
When I find out exactly where the data is divided, I will post in both the NEWSGROUP and ANSWERS.
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (1)
Walter Roberson
on 1 Jul 2015
The weights are initialized randomly unless you specifically initialize them.
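A sketch of setting the weights and biases yourself with SETWB, which removes the RNG from the picture entirely (the all-zero vector is only an illustration; zeros are generally a poor starting point for actual training):

```matlab
x = rand(3,150); t = rand(1,150);     % placeholder data
net = fitnet(10);
net = configure(net, x, t);           % size the weight/bias vector
wb = zeros(size(getwb(net)));         % example fixed values, not recommended
net = setwb(net, wb);                 % identical starting point every run
```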