How can I predict data with a neural network from new inputs after fitting the data?
Atiyo Banerjee
on 26 Jun 2014
Commented: sidra muqaddas
on 26 Oct 2016
I used the Neural Network fitting tool to train my data and got outputs for each target that I supplied to the network. Those outputs are well within the error range and give a good fit. But now I want to predict outputs for input samples that were not included in the data set I previously gave nftool. Please tell me how I can do that. The new input samples are within the range of the training set.
3 Comments
sidra muqaddas
on 26 Oct 2016
How do you predict the output for a new input after the training is done (using code, not nntoolbox variables)?
sidra muqaddas
on 26 Oct 2016
x = [0 1 0; 0 1 0; 1 1 0; 1 1 1; 0 1 1; 1 1 1; 0 1 1; 1 1 0]; % three samples (columns) from the input training data
t = [0 0 1; 1 0 0; 0 1 0; 0 0 0; 0 0 0; 0 0 0];               % three samples (columns) from the target training data
[ni, N] = size(x);          % ni = number of input neurons, N = number of samples
[no, N] = size(t);          % no = number of output neurons
nh = 8;                     % number of neurons in the hidden layer
wih = 0.01*randn(nh,ni+1);  % weight matrix (input to hidden layer), last column holds the biases
who = 0.01*randn(no,nh+1);  % weight matrix (hidden to output layer), last column holds the biases
c = 0;
while c < 1000              % training epochs
    c = c + 1;
    for i = 1:N
        % forward pass
        for j = 1:nh
            netj(j) = wih(j,1:end-1)*x(:,i) + wih(j,end);
            outj(j) = tansig(netj(j));
        end
        for k = 1:no
            netk(k) = who(k,1:end-1)*outj' + who(k,end);
            outk(k) = logsig(netk(k));
            delk(k) = outk(k)*(1-outk(k))*(t(k,i)-outk(k)); % logsig derivative times output error
        end
        % back propagation
        for j = 1:nh
            s = 0;
            for k = 1:no
                s = s + who(k,j)*delk(k);
            end
            delj(j) = (1-outj(j)^2)*s;  % tansig derivative is 1 - outj^2
        end
        % weight updates (learning rate 0.5 for weights, 1 for biases)
        for k = 1:no
            for l = 1:nh
                who(k,l) = who(k,l) + 0.5*delk(k)*outj(l);
            end
            who(k,nh+1) = who(k,nh+1) + 1*delk(k);
        end
        for j = 1:nh
            for ii = 1:ni
                wih(j,ii) = wih(j,ii) + 0.5*delj(j)*x(ii,i);
            end
            wih(j,ni+1) = wih(j,ni+1) + 1*delj(j);
        end
    end
end
% forward pass over the whole training set with the trained weights
h = tansig(wih*[x; ones(1,N)]);
y = logsig(who*[h; ones(1,N)]);
y = round(y);                    % threshold the outputs
e = t - y;                       % training error
% new input to the network
csr = [0 1 0 0 0 0 1 0];         % current sensor reading
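Below is a minimal sketch of the prediction step the comment asks about, assuming the training loop above has already run so that wih and who hold the trained weights and csr is the new sample defined at the end of the listing.
% Forward pass for a single new input using the trained weights wih and who
hNew  = tansig(wih*[csr'; 1]);     % hidden-layer activations for csr (bias input appended)
yNew  = logsig(who*[hNew; 1]);     % output-layer activations
yPred = round(yNew)                % threshold at 0.5 to get the predicted output vector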
Accepted Answer
Greg Heath
on 29 Jun 2014
Incorrect understanding:
Generalization: Ability to perform well on nontraining data
Overfitting: the number of training equations, Ntrneq, not being sufficiently larger than the number of unknown weights, Nw, can be a cause of DECREASED generalization.
Mitigation: either increase Ndof, and/or use validation stopping (the default), and/or use regularization (e.g., TRAINBR).
Insufficient information:
size(input) = [ I N ] = [ ? ? ]
size(target) = [ O N ] = [ ? ? ]
default number of training examples Ntrn = N-2*round(0.15*N) = ?
number of training equations Ntrneq = Ntrn*O
reference mean-square errors
MSEtrn00 = mean(var(trntarget',1)) % Biased
MSEtrn00a = mean(var(trntarget',0))% DOF adjusted
MSEval00 = mean(var(valtarget',1)) % Unbiased
MSEtst00 = mean(var(tsttarget',1)) % Unbiased
number of hidden nodes, H = ?
number of unknown weights Nw = (I+1)*H+(H+1)*O = ?
number of estimation degrees of freedom Ndof = Ntrneq-Nw = ?
normalized mean-square errors
SSEtrn = sse(trntarget-trnoutput)
MSEtrn = SSEtrn/Ntrneq % mse(trntarget-trnoutput)
MSEtrna = SSEtrn/Ndof
NMSEtrn = MSEtrn/MSEtrn00
NMSEtrna = MSEtrna/MSEtrn00a
NMSEval = MSEval/MSEval00
NMSEtst = MSEtst/MSEtst00
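A hedged sketch of how the quantities above might be filled in for a network trained at the command line; the names net, x, t, tr (the training record) and the use of tr.trainInd to recover the training split are assumptions, not part of the answer.
% Sketch: filling in the quantities above for a net trained with [net, tr] = train(net, x, t)
[I, N]  = size(x);                        % inputs are I-by-N
[O, ~]  = size(t);                        % targets are O-by-N
H       = net.layers{1}.size;             % number of hidden nodes
Nw      = (I+1)*H + (H+1)*O;              % number of unknown weights
Ntrn    = numel(tr.trainInd);             % training samples after the default split
Ntrneq  = Ntrn*O;                         % number of training equations
Ndof    = Ntrneq - Nw;                    % estimation degrees of freedom

trntarget = t(:, tr.trainInd);            % training targets
trnoutput = net(x(:, tr.trainInd));       % network outputs on the training inputs
MSEtrn00  = mean(var(trntarget', 1));     % biased reference MSE
SSEtrn    = sum(sum((trntarget - trnoutput).^2));  % sum-squared training error
MSEtrn    = SSEtrn/Ntrneq;                % mean-squared error per training equation
NMSEtrn   = MSEtrn/MSEtrn00               % normalized training MSE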
More Answers (1)
Greg Heath
on 28 Jun 2014
newoutput = net(newinput)
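For context, here is a minimal command-line sketch of where that call fits; the sample data set, the 10-hidden-node fitnet, and the names x, t, newinput are placeholders, not anything from this thread.
% Minimal fitting workflow; data and network size are placeholders
[x, t] = simplefit_dataset;      % sample data set shipped with the toolbox
net    = fitnet(10);             % fitting network with 10 hidden neurons
net    = train(net, x, t);       % training with the default train/val/test split
newinput  = 5.3;                 % any input within the training range
newoutput = net(newinput)        % predict on data the network never saw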
Thank you for formally accepting my answer
Greg
4 Comments
Greg Heath
on 26 Oct 2016
You can always superimpose output plots (red) over target plots (blue) to obtain a better understanding of what causes errors.
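A small sketch of that overlay, assuming a trained network net and the data x, t from earlier (the names are assumptions):
% Overlay network outputs (red) on targets (blue) to inspect the errors
y = net(x);                      % outputs for the same inputs as the targets
plot(t', 'b');  hold on
plot(y', 'r');  hold off
legend('target', 'output')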