Neural network performance evaluation?

Daud on 25 Dec 2012
For evaluating NN performance over a given number of trials (retrainings), which approach is right, and why?
for trial=1:100
net=newff(....);
[net,tr,Y,E,Pf,Af] = train(...);
......;
end
OR
net=newff(....);
for trial=1:100
[net,tr,Y,E,Pf,Af] = train(...);
........;
end
Note: I am getting decent results with both approaches, but the latter gives me the best result.

Accepted Answer

Greg Heath on 1 Jan 2013
Thank you for formally accepting my answer!

More Answers (1)

Greg Heath on 27 Dec 2012
The first example is the correct one because it contains 100 random weight initializations. Therefore each net is a valid, independent result.
The 2nd example just keeps training the same net more and more.
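In code, the first approach looks roughly like this (a minimal sketch; x, t, H, and Ntrials are placeholder names, not from the original post; each pass builds a new net, so each trial starts from fresh random weights):
Ntrials = 100;
testmse = zeros(1,Ntrials);                 % test-set MSE of each trial
bestperf = Inf;
for trial = 1:Ntrials
    net = newff(x,t,H);                     % new net => new random initial weights
    [net,tr,Y,E] = train(net,x,t);          % E = t - Y over all samples
    testmse(trial) = mse(E(:,tr.testInd));  % score each net on its own test split
    if testmse(trial) < bestperf
        bestperf = testmse(trial);          % keep the best of the Ntrials nets
        bestnet = net;
    end
end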
What, exactly, do you mean by decent results?
Is this regression or classification?
Are you using validation stopping?
How many acceptable solutions out of 100?
If regression, what are the means and standard deviations of the training, validation, and testing NORMALIZED (by the average target variance) mean-square-errors?
I usually shoot for (but don't always get) NMSEtrn <= 0.01
For an I-H-O net
Ntrneq = prod(size(ttrn)) % Ntrn*O = No. of training equations
Nw = (I+1)*H +(H+1)*O % No. of unknown weights
NMSEtrn = sse(ttrn-ytrn)/(Ntrneq-Nw)/mean(var(ttrn',0))
NMSEi = mse(yi-ti)/mean(var(ti',1)) for i = val and test
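With y = sim(net,x) after training and the split indices stored in tr, these quantities can be computed as follows (a sketch; the split variable names are illustrative):
y = sim(net,x);                             % outputs for all samples
[I,~] = size(x);  [O,~] = size(t);
ttrn = t(:,tr.trainInd);  ytrn = y(:,tr.trainInd);
tval = t(:,tr.valInd);    yval = y(:,tr.valInd);
ttst = t(:,tr.testInd);   ytst = y(:,tr.testInd);
Ntrneq = prod(size(ttrn));                  % Ntrn*O training equations
Nw = (I+1)*H + (H+1)*O;                     % unknown weights of the I-H-O net
NMSEtrn = sse(ttrn-ytrn)/(Ntrneq-Nw)/mean(var(ttrn',0));   % DOF-adjusted
NMSEval = mse(tval-yval)/mean(var(tval',1));
NMSEtst = mse(ttst-ytst)/mean(var(ttst',1));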
I have posted many examples in the NEWSGROUP and ANSWERS. Try searching on
heath newff Ntrials
Hope this helps.
Thank you for formally accepting my answer.
Greg
  8 Comments
Daud on 1 Jan 2013
Thanks for your answer and for correcting my spelling of "trial". But I still can't incorporate the facts you mentioned. By the way, the total data set is divided into train, validation, and test sets, and the recognition rate mentioned above is the overall recognition rate (train, val, and test).
Why should I be concerned about over-fitting, since I am using validation stopping?
OK Greg, I have a query: after trial 1 in the 2nd approach, suppose the initial weights are w1, w2, ..., wn. Now, in trial 2, are the initial weights changed, or are they the same w1, w2, ..., wn as in trial 1?
If the initial weights are the same in each trial I totally agree with you; but if not... I am confused.
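One quick way to check this is to compare getwb(net) before and after each call to train (a sketch using getwb and init from the toolbox; the setup mirrors the code in the next comment):
net = newff(input_all,Targets,7,{'tansig','tansig'},'trainscg');
w0 = getwb(net);                        % random weights from creation
net = train(net,input_all,Targets);
w1 = getwb(net);                        % weights after "trial 1"
net = train(net,input_all,Targets);     % "trial 2", no re-initialization in between
w2 = getwb(net);                        % compare with w1 to see where trial 2 started
net = init(net);                        % init explicitly re-randomizes the weights
w3 = getwb(net);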
Daud on 1 Jan 2013
Greg, I am posting my full code here; please check it out:
clc
close all
load Input_n
run target_00
order=randperm(size(input_all,2));
input_all=input_all(:,order);
Targets=Targets(:,order);
n_trial=100;
c_tr{1,n_trial}= [];cm_tr{1,n_trial}=[];ind_tr{1,n_trial}=[];per_tr{1,n_trial}=[];
c_ts{1,n_trial}= [];cm_ts{1,n_trial}=[];ind_ts{1,n_trial}=[];per_ts{1,n_trial}=[];
c_val{1,n_trial}= [];cm_val{1,n_trial}=[];ind_val{1,n_trial}=[];per_val{1,n_trial}=[];
%c_val_ovrl={1,100};cm_val_ovrl={1,100};ind_val_ovrl={1,100};per_val_ovrl={1,100};
c_ovrl{1,n_trial}= [];cm_ovrl{1,n_trial}= [];ind_ovrl{1,n_trial}= [];per_ovrl{1,n_trial}= [];
%c_tr_ovrl={1,100};cm_tr_ovrl={1,100};ind_tr_ovrl={1,100};per_tr_ovrl={1,100};
tr_info{1,n_trial} = [];
tr_net{1,n_trial}= [] ;
net=newff(input_all,Targets,7,{'tansig','tansig'},'trainscg','learngdm','msereg');
net.inputs{1}.processFcns = {'mapstd'}; % set after newff, otherwise newff overwrites it
%training parameters
net.trainParam.epochs=1000;
net.trainParam.goal=0;
net.trainParam.max_fail=6;
%Division parameters
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 20/100;
net.divideParam.testRatio = 10/100;
for i=1:n_trial
close all
%net = init(net);
[net,tr,Y,E] = train(net,input_all,Targets);
info=trainscg('info');
tr_info{i} = tr;
tr_net{i} = net;
outputs_test=sim(net,input_all(:,tr.testInd));
outputs_val=sim(net,input_all(:,tr.valInd));
outputs_ovrl = sim(net,input_all);
%outputs=sim(net,input_test);
%[m,b,r] = postreg(outputs_test,Targets(:,tr.testInd))
%recog=sim(net,input_test);
%a=compet(recog)
[c_tr{i},cm_tr{i},ind_tr{i},per_tr{i}] = confusion(Targets(:,tr.trainInd),Y(:,tr.trainInd));
[c_val{i},cm_val{i},ind_val{i},per_val{i}] = confusion(Targets(:,tr.valInd),outputs_val);
[c_ts{i},cm_ts{i},ind_ts{i},per_ts{i}] = confusion(Targets(:,tr.testInd),outputs_test);
[c_ovrl{i},cm_ovrl{i},ind_ovrl{i},per_ovrl{i}] = confusion(Targets,outputs_ovrl);
%plotperf(tr)
%grid on
%plotconfusion(Targets(:,tr.trainInd),Y,'Training',Targets(:,tr.valInd),outputs_val,'Validation',Targets(:,tr.testInd),outputs_test,'Test')
%pause
end
%Result evaluation
Avg_recg_rt_ovrl = 100 - mean(cell2mat(c_ovrl));
Avg_recg_rt_tr = 100 - mean(cell2mat(c_tr));
Avg_recg_rt_ts = 100 - mean(cell2mat(c_ts));
Avg_recg_rt_val = 100 - mean(cell2mat(c_val));
[min_err, trial_num] = min(cell2mat(c_ovrl));
best_recg = 100 - min_err;
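To match Greg's first approach, the only change needed is to re-randomize the weights at the top of each iteration, e.g. (a sketch; the rest of the bookkeeping stays the same):
for i=1:n_trial
    net = init(net);                        % fresh random weights each trial
    [net,tr,Y,E] = train(net,input_all,Targets);
    % ... same confusion/bookkeeping as above ...
end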
