How can I set the parameters of the feedforward neural network?
Xiaomin Li
on 4 Jul 2017
Commented: Xiaomin Li
on 13 Jul 2017
How can I set the parameters of the feedforward neural network? How can I find the optimal number of hidden layers and the number of nodes in each layer? Thanks a lot!
Accepted Answer
Greg Heath
on 5 Jul 2017
For run-of-the-mill problems, you can use default settings except for
a. The number of hidden nodes (default is 10)
b. The initial weights and biases (the default, RANDOM, is best)
When
[ I N ] = size(input)
[ O N ] = size(target)
% Network topology is I - H - O
Ntst = round(0.15*N) % default 15% test split
Nval = round(0.15*N) % default 15% validation split
Ntrn = N - Nval - Ntst
% Number of training equations
Ntrneq = Ntrn*O % ~0.7*N*O
% No. of unknown weights and biases
% Nw = ( I + 1 )*H +( H + 1 )*O
Nw = O + (I + O + 1 )* H
% OVERFITTING (more unknowns than equations) occurs when H > Hub, where
Hub = (Ntrneq - O)/(I + O + 1)
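The bookkeeping above can be sketched numerically. The dataset sizes here (I = 4 inputs, O = 1 output, N = 500 examples) are hypothetical, chosen only to illustrate the arithmetic:

```
% Hypothetical sizes, not from the original post
I = 4; O = 1; N = 500;
Ntst = round(0.15*N); % 75 test examples
Nval = round(0.15*N); % 75 validation examples
Ntrn = N - Nval - Ntst; % 350 training examples
Ntrneq = Ntrn*O; % 350 training equations
Hub = (Ntrneq - O)/(I + O + 1) % upper bound ~ 58 hidden nodes
```

With these numbers, choosing H above ~58 would give more unknown weights than training equations, i.e., an overfit net.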
To prevent overtraining an overfit net, which would impair its ability to perform well on nontraining data, one or a combination of the following can be implemented:
a. H <= Hub % Don't overfit!
b. Train with VALIDATION STOPPING to prevent poor
performance on the validation subset and other
(e.g., testing and unseen) data
c. Use REGULARIZATION (see help/doc TRAINBR) to add
weighted sums of squared weights to the minimization function.
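Options b and c can be sketched with the Neural Network Toolbox as below. The 10-node size and the 0.70/0.15/0.15 split are the toolbox defaults; the variable names `input` and `target` are the matrices defined earlier:

```
% Sketch, assuming Neural Network Toolbox defaults
net = feedforwardnet(10);          % H = 10 hidden nodes (default)
net.divideParam.trainRatio = 0.70; % validation stopping monitors
net.divideParam.valRatio   = 0.15; % the 15% validation subset
net.divideParam.testRatio  = 0.15;
[net, tr] = train(net, input, target); % stops when val error rises

% Alternative: Bayesian regularization (see help/doc trainbr)
netbr = feedforwardnet(10, 'trainbr');
netbr = train(netbr, input, target);
```

Note that TRAINBR disables the validation subset by default, since regularization replaces validation stopping as the defense against overfitting.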
I tend to use VALIDATION STOPPING and a double-loop approach to minimizing H by trial and error with random initial weights.
Search the NEWSGROUP and ANSWERS using
Hmin:dH:Hmax
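A minimal sketch of that double loop follows. The outer loop steps over candidate hidden-layer sizes Hmin:dH:Hmax; the inner loop retries each size with different random initial weights. The ranges and trial count here are assumptions for illustration, not Greg's exact values:

```
% Sketch of the double-loop search (assumed ranges)
Hmin = 1; dH = 2; Hmax = 15; % candidate hidden-layer sizes
Ntrials = 10;                % random initializations per H
bestvperf = Inf;
for H = Hmin:dH:Hmax
    for trial = 1:Ntrials
        rng(trial)           % reproducible random initial weights
        net = feedforwardnet(H);
        [net, tr] = train(net, input, target);
        if tr.best_vperf < bestvperf   % compare validation error
            bestvperf = tr.best_vperf;
            bestnet = net; bestH = H;
        end
    end
end
```

Selecting on the validation performance (`tr.best_vperf`) rather than training error keeps the smallest H that generalizes well.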
Hope this helps
Thank you for formally accepting my answer
Greg
More Answers (0)