how do i calculate the accuracy of ANN?
amir
on 18 Sep 2014
Answered: Greg Heath
on 19 Sep 2014
how do i calculate the accuracy of ANN?
I have 10 neural network structures and want to find the one whose predictions are as close as possible to the actual values. I use trainlm. So my question is: how can I choose the best structure, other than by mean absolute percentage error (MAPE)?
Accepted Answer
Greg Heath
am 19 Sep. 2014
So this is regression, not classification. Use fitnet and vary the number of hidden nodes. For each number of hidden nodes, design Ntrials nets that differ only in their sets of random initial weights.
For examples, search the NEWSGROUP and ANSWERS using
greg fitnet Hub Ntrials
Use the training record tr to separate the trn/val/tst indices and performances.
Choose a performance index
MSE % scale dependent; useless unless compared with a reference
NMSE = MSE/mean(var(target',1)) % normalized MSE; typically 0 <= NMSE <= 1
R2 = 1 - NMSE % fraction of target variance modeled by the net (search "R-squared" in Wikipedia)
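The NMSE and R2 formulas above are language-agnostic; here is an illustrative Python/numpy sketch (my translation, not Greg's code), using the same column-per-sample layout as MATLAB targets:

```python
import numpy as np

def nmse(target, output):
    """Normalized MSE: MSE divided by the mean (biased) target variance.
    target, output: arrays of shape (n_outputs, n_samples)."""
    mse = np.mean((target - output) ** 2)
    # mean(var(target',1)) in MATLAB: biased variance of each output row
    norm = np.mean(np.var(target, axis=1, ddof=0))
    return mse / norm

def r_squared(target, output):
    """Fraction of target variance modeled by the net."""
    return 1.0 - nmse(target, output)
```

A constant-mean predictor gives NMSE = 1 (R2 = 0); a perfect fit gives NMSE = 0 (R2 = 1).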
Separately tabulate the trn/val/tst results. Choose the net that has the best validation performance. If there are several with performances that are not significantly different, choose one with the smallest number of hidden nodes.
Use the corresponding test performance to predict performance on unseen data if the net is deployed.
Searching the NEWSGROUP and ANSWERS with
greg ntrials
will provide examples.
Thank you for formally accepting my answer.
Greg
More Answers (1)
Greg Heath
on 18 Sep 2014
Edited: Greg Heath
on 18 Sep 2014
Insufficient info.
Are these classifiers? If so, are the class size ratios of the training data the same as that for unseen nontraining data?
There are four measures used for classifiers:
BayesianRisk = sum(sum(Pj*Cij*pij))
Pj jth class prior probability
pij posterior probability estimate for the ith class when the input is from class j
Cij cost of misclassifying an input from class j as class i (Cii = 0)
ErrorRate = sum(Pi*Erri)
crossentropy1 = -sum(ti*log(yi)) % mutually exclusive classes
crossentropy2 = crossentropy1 - sum((1-ti)*log(1-yi)) % non-mutually-exclusive classes
Risk and crossentropy are best for design (error rate is not a continuous function of the weights).
Error rate on nontraining data is best for selection.
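For concreteness, here is a Python/numpy sketch of the cross-entropy and error-rate measures above (my illustration, assuming one-hot target columns and an argmax decision rule; Bayesian risk additionally needs the cost matrix Cij):

```python
import numpy as np

def crossentropy1(t, y, eps=1e-12):
    """Mutually exclusive classes: -sum(ti*log(yi)) over one-hot targets t."""
    return -np.sum(t * np.log(y + eps))

def crossentropy2(t, y, eps=1e-12):
    """Non-mutually-exclusive (multi-label) targets."""
    return crossentropy1(t, y, eps) - np.sum((1 - t) * np.log(1 - y + eps))

def error_rate(t, y):
    """Fraction of samples assigned to the wrong class.
    t, y: shape (n_classes, n_samples), one column per sample."""
    return np.mean(np.argmax(y, axis=0) != np.argmax(t, axis=0))
```

Note that crossentropy varies smoothly with the network outputs, while error_rate changes only when an argmax flips, which is why the former suits training and the latter suits final selection.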