When using narnet to predict future prices, do we need to determine the optimal lags first in order to determine the optimal hidden layer sizes?

To determine the hidden layer sizes, should we:
1/ apply a trial and error method using lags = 1 (the default value), or
2/ first verify the optimal delays using the autocorrelation function, and then use the obtained optimal lags in the trial and error search for the optimal number of hidden nodes?
Thanks in advance
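
(For reference, a minimal sketch of where the two quantities in question enter a narnet design; the price variable, the lags 1:2, and the hidden layer size 10 below are placeholders, not recommendations.)

% Minimal narnet sketch: the two design choices are the feedback delays (lags)
% and the hidden layer size. 'price' is a hypothetical 1-by-N time series.
T = tonndata(price, false, false);

feedbackDelays  = 1:2;     % candidate lags (e.g., from the autocorrelation function)
hiddenLayerSize = 10;      % candidate number of hidden nodes

net = narnet(feedbackDelays, hiddenLayerSize);
net.divideFcn = 'divideblock';           % contiguous trn/val/tst blocks

[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
[net, tr] = train(net, Xs, Ts, Xi, Ai);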

Accepted Answer

Greg Heath
Greg Heath on 11 May 2017
Edited: Greg Heath on 11 May 2017
1. For unbiased prediction, use divideblock so that the delays and weights are not determined by nontraining (i.e., validation and test) data (see the sketch after this answer).
2. It is worthwhile to plot the trn/val/tst data in three colors to view the data division. Beware if the training subset doesn't look like it could be used to predict the nontraining data.
3. Estimate unbiased values for delays by determining the significant lags of the training subset autocorrelation function.
4. Given a subset of the significant lags to use as delays, you can determine the maximum number of hidden nodes so that the number of unknown weights, Nw, does not exceed the number of training equations, Ntrneq.
5. By trial and error, determine the smallest number of hidden nodes that yields a sufficiently low error rate for the training and validation subsets. If you exceed the maximum number of hidden nodes determined in 4, you have to beware of the overtraining/overfitting phenomenon (more unknowns than equations).
6. I tend to use 10 or more trials of random initial weights for each setting of hidden nodes.
7. I have zillions of posts in both the NEWSGROUP and ANSWERS. The posts in the NEWSGROUP tend to be more tutorial in nature.
8. Finally, a direct answer to your question: No. All you have to do is find a good combination of lags and hidden nodes that will yield a good unbiased prediction.
Since I could not find a good tutorial, I made up my own.
Hope this helps.
Greg
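
A sketch of steps 1, 3, 4, and 5/6 above, under some assumptions not stated in the answer: 'price' is a hypothetical series, the 0.70/0.15/0.15 split, the maximum lag searched, the 1.96/sqrt(Ntrn) significance rule of thumb, the candidate list of H values, and Ntrneq = Ntrn*O are all illustrative choices, and xcorr (Signal Processing Toolbox) is used for the autocorrelation.

% Sketch of the design procedure above (illustrative values throughout)
T = tonndata(price, false, false);        % 'price': hypothetical 1-by-N series
N = numel(T);  O = 1;                     % single scalar output

% 1. Block division so that val/test data stay out of design decisions
trnRatio = 0.70;  valRatio = 0.15;  tstRatio = 0.15;
Ntrn = floor(trnRatio*N);

% 3. Significant lags of the TRAINING-subset autocorrelation
xtrn   = cell2mat(T(1:Ntrn));
maxlag = 50;                              % assumed search range
c   = xcorr(xtrn - mean(xtrn), maxlag, 'coeff');
c   = c(:).';
acf = c(maxlag+2:end);                    % positive lags 1..maxlag
lags = 1:maxlag;
sig  = 1.96/sqrt(Ntrn);                   % common ~95% white-noise threshold
siglags = lags(abs(acf) > sig);

% 4. Upper bound on hidden nodes from Nw = (MXFD*O+1)*H + (H+1)*O <= Ntrneq
FD     = siglags;                         % delays actually used
MXFD   = max(FD);
Ntrneq = Ntrn*O;                          % assumed definition of Ntrneq
Hub    = floor((Ntrneq - O)/(MXFD*O + O + 1));

% 5./6. Trial and error: several random initializations per candidate H
bestvperf = Inf;
for H = [2 5 10 min(20, Hub)]             % assumed candidate list
    for trial = 1:10                      % 10 random weight initializations
        net = narnet(FD, H);
        net.divideFcn = 'divideblock';
        net.divideParam.trainRatio = trnRatio;
        net.divideParam.valRatio   = valRatio;
        net.divideParam.testRatio  = tstRatio;
        [Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
        [net, tr] = train(net, Xs, Ts, Xi, Ai);
        if tr.best_vperf < bestvperf      % keep the best validation result
            bestvperf = tr.best_vperf;  bestnet = net;  bestH = H;
        end
    end
end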
2 Comments
coqui
coqui on 11 May 2017
Ntrn = 2000, Nval = 300, Ntst = 600.
The significant lags found are [1,2,3,20,21,25,29,30,35,36,39], and Nw = (MXFD*O+1)*H + (H+1)*O.
To ensure that the number of unknown weights Nw does not exceed the number of training equations Ntrneq, what is the maximum number of hidden nodes that can be used in this case?
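
(A rough worked bound from these numbers, as a sketch only: it assumes O = 1, takes Ntrneq = Ntrn*O = 2000, and shows the result both when the largest lag MXFD counts in the formula and when only the number of delays NFD counts; point 4 of the reply below discusses exactly this ambiguity.)

O = 1;  Ntrn = 2000;  Ntrneq = Ntrn*O;     % assuming one scalar output
FD   = [1 2 3 20 21 25 29 30 35 36 39];
MXFD = max(FD);                            % 39
NFD  = numel(FD);                          % 11

% Nw = (d*O+1)*H + (H+1)*O <= Ntrneq  =>  H <= (Ntrneq-O)/(d*O+O+1)
Hub_mxfd = floor((Ntrneq - O)/(MXFD*O + O + 1))   % 48  if d = MXFD
Hub_nfd  = floor((Ntrneq - O)/(NFD*O  + O + 1))   % 153 if d = NFD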
Greg Heath
Greg Heath on 12 May 2017
0. NO MATTER WHAT YOU DO, THE IMPORTANT THING TO REMEMBER IS THAT YOU ARE DESIGNING A NET TO PERFORM WELL ON NONTRAINING DATA WITH A REASONABLE LEVEL OF CONFIDENCE!
1. ALWAYS take a good look at the three colored (trn/val/tst) plots of
a. Input x
b. autocorrelation of x
2. Does it make sense to rigidly apply my suggestions to your data?
3. If not, does it make sense to change your data division ratios?
4. I don't know what happens if there are gaps in the delays that are used. In particular,
a. Are weights with value 0 assigned to the delays that are skipped over, which should then be counted w.r.t. Nw and Hub?
b. The MXFD in the expression for Nw assumes a.
c. OR are there simply no weights for the skipped delays, so that MXFD should be replaced by NFD?
5. If using lags 1:3 is not sufficient, I would definitely
a. Take a good look at the autocorrelation plot.
b. Consider decreasing the significance level to consider lags 4,5,...
6. I would also consider using H > Hub and mitigating overtraining of the overfit net (a rough sketch follows this comment) via
a. Validation Stopping
b. Bayesian Regularization (e.g., TRAINBR)
NOTE: I have found articles where better results were achieved when a and b are used simultaneously. However, the last time I looked, TRAINBR IMPOSES Nval = 0!!!
7. The bottom line is that there are no hard and fast rules except what I stated in number 0.
8. Please let us know what you did and how it turned out.
Good Luck,
Greg
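
A sketch of point 6 above, under stated assumptions: the delays are taken from the comment above, H = 100 is an arbitrary value larger than Hub, and the 0.85/0/0.15 split for trainbr simply gives the former validation share back to training; none of this comes from the reply itself.

% Sketch: deliberately overfit (H > Hub) and rely on regularization.
H  = 100;                                  % illustrative, larger than Hub
FD = [1 2 3 20 21 25 29 30 35 36 39];

% (a) Validation stopping: the default behavior with a train/val/test division
net_a = narnet(FD, H);
net_a.divideFcn = 'divideblock';

% (b) Bayesian regularization: trainbr does not use the validation subset
%     for stopping, so its share is given back to training here
net_b = narnet(FD, H, 'open', 'trainbr');
net_b.divideFcn = 'divideblock';
net_b.divideParam.trainRatio = 0.85;       % former train + val share
net_b.divideParam.valRatio   = 0;
net_b.divideParam.testRatio  = 0.15;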
