neural network test with a new data set
13 views (last 30 days)
Young Tae
on 26 Feb 2013
Commented: Joana
on 8 Mar 2020
Dear MATLAB experts. Actually, I'm not familiar with neural network analysis. I want to forecast outdoor air temperature with an input set (ground temp, cloud, relative humidity). The training/validation/testing is okay with the input data set (1X2877) and target data (1X2877). However, I am stuck evaluating the network with the new data set (1X960), which has the same input structure. Would you shed some light on this for me? I have lost my way trying to resolve the issue. I appreciate your valuable time spent on this issue.
===here is my code===
% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by NTSTOOL
% Created Fri Feb 22 15:22:18 EST 2013
%
% This script assumes these variables are defined:
%
%   JULYTH - input time series.
%   JULYE  - feedback time series.
% This is 1x2877 data; each element contains the three inputs [a;b;c]
inputSeries = tonndata(JYTH,false,false);
% This is 1x2877 data holding the target output
targetSeries = tonndata(JYE,false,false);
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
%net.inputs{3}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged,
% while easily customizing it for networks with differing numbers of delays,
% with open loop or closed loop feedback modes.
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
% Setup Division of Data for Training, Validation, Testing
% The function DIVIDERAND randomly assigns target values to training,
% validation and test sets during training.
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand'; % Divide data randomly
% The property DIVIDEMODE set to TIMESTEP means that targets are divided
% into training, validation and test sets according to timesteps.
% For a list of data division modes type: help nntype_data_division_mode
net.divideMode = 'value'; % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Training Function
% For a list of all training functions type: help nntrain
% Customize training parameters at: net.trainParam
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
% Customize performance parameters at: net.performParam
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
% Customize plot parameters at: net.plotParam
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
    'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
% View the Network
view(net)
% From this part I want to run a new test or forecast with new inputs.
% This is a new input set, 1x960. The matrix has the same structure as the
% training input, [a;b;c].
inputSeries2 = tonndata(AUGTH,false,false);
[inputs2,inputStates2,layerStates2,targets2] = preparets(net,inputSeries2);
% When I want to generate a new output from the network all "output2"(1X960) has % NaN. I suspect that "inputStates2" has NaN value its second row. Would % you let me know how I resolve the issue and get the new output2?
outputs2 = net(inputs2,inputStates2,layerStates2);
0 comments
Accepted Answer
Greg Heath
on 26 Feb 2013
I want to forecast outdoor air temperature with input set(ground temp, cloud, relative humidity). The training/validation/testing is okay with the input data set(1X2877) and target data(1X2877).
What, exactly, does "okay" mean?... What are the val and test R^2 values?
How can you have a one dimensional input data set when you have 3 input variables?
This script assumes these variables are defined: JULYTH - input time series. JULYE - feedback time series. This is 1X2877 matrix data has [a;b;c] for each
for each series? That makes no sense.
If you have 3 inputs the input matrix dimension should be [ 3 2877]!
Unless it is cell data and not matrix data...
To make sure, type the 4 commands
iscell( [ JYTH ; JYE ] )
[ I N ] = size(JYTH)
[O N] = size(JYE)
whos
inputSeries = tonndata(JYTH,false,false); This is 1X2877 matrix data has target output targetSeries = tonndata(JYE,false,false);
You seem to be confused. tonndata produces cell data!
whos inputSeries targetSeries
Create a Nonlinear Autoregressive Network with External Input inputDelays = 1:2; feedbackDelays = 1:2; hiddenLayerSize = 10;
How do you know these are good inputs? To be sure, calculate the significant lags for the output autocorrelation function AND the input/output crosscorrelation function
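A rough sketch of that lag check (not from the original post; it assumes xcorr from the Signal Processing Toolbox, and maxlag and the 95% bound are arbitrary choices):
% Sketch only: find significant target autocorrelation and input/target
% crosscorrelation lags as candidate feedbackDelays / inputDelays
t = cell2mat(targetSeries);                  % 1 x N target series
x = cell2mat(inputSeries);                   % 3 x N input series
N = numel(t);
maxlag = 50;
tz = (t - mean(t))/std(t);                   % standardize before correlating
xz = (x(1,:) - mean(x(1,:)))/std(x(1,:));    % input 1; repeat for the others
[acf,lags] = xcorr(tz,tz,maxlag,'coeff');    % target autocorrelation
ccf = xcorr(xz,tz,maxlag,'coeff');           % input-1/target crosscorrelation
acf = acf(:); ccf = ccf(:); lags = lags(:);
sigbound = 1.96/sqrt(N);                     % approximate 95% significance level
sigFD = lags(lags > 0 & abs(acf) > sigbound)'  % candidate feedbackDelays
sigID = lags(lags > 0 & abs(ccf) > sigbound)'  % candidate inputDelays (input 1)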
Choose Input and Feedback Pre/Post-Processing Functions... net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'}; net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'}; net.inputs{3}.processFcns = {'removeconstantrows','mapminmax'};
Delete these statements. These are defaults.
Prepare the Data for Training and Simulation [inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
Double check dimensions and data classes:
whos inputSeries targetSeries inputs inputStates layerStates targets
Setup Division of Data for Training, Validation, Testing. The function DIVIDERAND randomly assigns target values to training, validation and test sets during training...
net.divideFcn = 'dividerand'; % Divide data randomly
NO, NO, NO!
RANDOM DIVISION DESTROYS AUTO AND CROSS CORRELATIONS
USE 'divideblock'
Hard to believe previous val and test results are "okay" if you used 'dividerand'
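For a contiguous time series, block division keeps the train/val/test segments in time order, e.g.
net.divideFcn = 'divideblock';  % first 70% training, next 15% validation, last 15% test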
The property DIVIDEMODE set to TIMESTEP means that targets are divided into training, validation and test sets ... net.divideParam.trainRatio = 70/100; net.divideParam.valRatio = 15/100; net.divideParam.testRatio = 15/100;
These are defaults. Delete these statements unless you want to change the percentages.
Choose a Training Function ... net.trainFcn = 'trainlm'; % Levenberg-Marquardt Choose a Performance Function ... mse
These are defaults! Delete these statements unless you want to use other choices.
Choose Plot Functions... net.plotFcns = {'plotperform','plottrainstate','plotresponse', ... 'ploterrcorr', 'plotinerrcorr'};
These are part of a seven-plot default list. Delete them and you will get these PLUS ploterrhist and plotregression.
Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
You can delete most of the following statements. A detailed performance summary is already contained in the training structure tr. Type the command
tr = tr
Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
Most important are the test and validation results. If they are not very small compared to mean(var(testtarget',1)) and mean(var(valtarget',1)), respectively, then you need to improve your design by using a better choice of delays, or checking out other random weight initializations or maybe even changing the number of hidden nodes from H=10.
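A sketch of that comparison, expressed as a normalized MSE / R^2 (variable names here are mine, not from the thread):
% Sketch only: R^2 of the test set relative to the test-target variance
tt = cell2mat(testTargets);  tt = tt(~isnan(tt));  % keep only test-set values
MSE00tst = mean(var(tt',1));                       % reference: test target variance
R2tst = 1 - testPerformance/MSE00tst               % want this close to 1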
Next you should close the loop and test the closed loop trn, val and tst results.
Now is a reasonable time to try new data on the closed-loop net.
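A sketch of closing the loop and checking it on the design data (it assumes inputSeries and targetSeries from the script above are still in the workspace; the final states are kept for the new-data forecast below):
netc = closeloop(net);                        % feed the output back as the feedback input
[Xc,Xic,Aic,Tc] = preparets(netc,inputSeries,{},targetSeries);
[Yc,Xfc,Afc] = netc(Xc,Xic,Aic);              % keep final states Xfc, Afc for later
closedLoopPerformance = perform(netc,Tc,Yc)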
View the Network view(net)
Probably should view the net right after the train statement
From this part I want to run a new test or forecast with new inputs. This is a new input set, 1X960. The matrix has the same structure as the training input [a;b;c]
inputSeries2 = tonndata(AUGTH,false,false);
[inputs2,inputStates2,layerStates2,targets2] = preparets(net,inputSeries2);
How can you generate layerstates2 and targets2 without a targetSeries2?
You have to use a closed loop net.
ALWAYS check the preparets I/O
whos inputSeries2 inputs2 inputStates2 layerStates2 targets2
When I want to generate a new output from the network, all of "output2" (1X960) is NaN. I suspect that "inputStates2" has NaN values in its second row. Would you let me know how I resolve the issue?
See above
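In other words, something along these lines (a sketch only; it assumes netc and its final states Xfc/Afc from the closed-loop check above, and that AUGTH starts immediately after the July series ends):
inputSeries2 = tonndata(AUGTH,false,false);   % external inputs only, 1x960
outputs2 = netc(inputSeries2,Xfc,Afc);        % closed loop: no target series needed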
Hope this helps.
Thank you for formally accepting my answer
Greg
4 comments
Greg Heath
on 16 Oct 2013
If the new data immediately follows the data used to design and test the net, the following syntax should have been used
[net,tr,Ys,Es,Xsf,Asf] = train(net,Xs,Ts,Xi,Ai);
Xinew = Xsf; Ainew = Asf;
Ysnew = net(Xsnew,Xinew,Ainew);
Otherwise
Xinew = Xnew(:,1:d);
Xsnew = Xnew(:,d+1:end);
but Ainew is not known.
I would try the mean of the previously used test target data rather than use zeros. Perhaps several designs using values in the interval [mean-stdv,mean+stdv] would be useful.
Joana
on 8 Mar 2020
Hi Greg
I am trying to test a trained NN model in MATLAB but it gives the wrong output. I have trained the model on an input of size 223x448 with labels of size 223x1, and tested on another set of size 114x448, where the 114 samples comprise two classes with 57 labels each for class 1 and class 2. When I test the model it classifies everything as a single class (class 1 or class 2) and gives the confusion matrix [57 57; 0 0]. Can you please help me find where I am going wrong?
The code to train the model:
trainFcn = 'trainscg'; % Scaled conjugate gradient backpropagation.
hiddenLayerSize = [10 10 10];
net = patternnet(hiddenLayerSize, trainFcn);
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.performFcn = 'crossentropy'; % Cross-Entropy
net.plotFcns = {'plotconfusion'};
[net,tr] = train(net,train_f,train_L);
And this is how I test:
y = net(test_f);
e = gsubtract(test_L,y);
performance = perform(net,test_L,y);
tind = vec2ind(test_L);
yind = vec2ind(y);
percentErrors = sum(tind ~= yind)/numel(tind);
% Testing accuracy
[values,pred_ind]=max(y,[],1);
[~,actual_ind]=max(test_L,[],1);
accuracy=sum(pred_ind==actual_ind)/size(test_f,2)*100
fprintf('\n Classification Accuracy (NN): %g %%',accuracy);
[~,con]=confusion((test_L),y);
figure, plotconfusion(test_L,y)
Any help will be highly appreciated.
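One likely cause, sketched here under the assumption that train_f/test_f hold one sample per row and that train_L/test_L hold class indices 1 and 2 (0/1 labels would need +1 first): patternnet expects one sample per column and one-hot targets, so the data would be transposed and the labels converted with ind2vec before training and testing, along these lines:
% Sketch only; shapes and label coding are assumptions, not taken from the post
Xtrain = train_f';                  % 448 x 223, one sample per column
Ttrain = full(ind2vec(train_L'));   % 2 x 223 one-hot targets from class indices
net = patternnet([10 10 10],'trainscg');
[net,tr] = train(net,Xtrain,Ttrain);
Ytest = net(test_f');               % 2 x 114 class scores
[~,pred] = max(Ytest,[],1);         % predicted class index per test sample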
More Answers (3)
Mohan
on 26 Feb 2013
The testing is usually done as follows:
a = sim(net,testInput');
where net is the narx net in your program,
testInput is the new data set.
look for "sim" in Matlab help
1 comment
abdulkader helwan
on 19 Dec 2013
Hello. I have created a backpropagation neural network in MATLAB for prediction of heart attack. I have trained it on a dataset, it worked, and it gave the desired output. The problem is that I don't know how to test it afterwards. If anyone can help, please don't hesitate. This is my code for training the network:
clear all
close all
clc
case_number = 151;
PATTERNS = [];
dataset = xlsread('dataset.xlsx','sheet1');
[row col] = size(dataset);
PATTERNS = [dataset];
% Desired Output Code
D1 = [1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0];
D2 = [0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0];
D3 = [0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0];
%********************************************************
dis.out = [D1; D2; D3];
[g,h] = size(PATTERNS);
[m,h] = size(dis.out);
% CREATING AND INITIATING THE NETWORK
net = newff(minmax(PATTERNS),[14 3],{'logsig','logsig'},'traingdx')
net = init(net);
net.LW{2,1} = net.LW{2,1}*0.01;
%net.b{2} = net.b{2}*0.01;
% TRAINING THE NETWORK
net.trainParam.goal = 0.001;   % Sum-squared error goal.
net.trainParam.lr = 0.01;      % Learning Rate.
net.trainParam.alpha = 0.5;
net.trainParam.show = 100;     % Frequency of progress displays (in epochs).
net.trainParam.epochs = 1000;  % Maximum number of epochs to train.
net.trainParam.mc = 0.5;       % Momentum Factor.
k = case_number
for k=1:41
[net,tr] = train(net,PATTERNS,D1); % Normal....
end
actout.normal=sim(net,PATTERNS);
actout.normal
norm.test
%
for k=42:97
[net,tr] = train(net,PATTERNS,D2); % Abnormal....
end
act.abnormal=sim(net,PATTERNS);
act.abnormal
for k=98:151
[net,tr] = train(net,PATTERNS,D3); % Severe....
end
act.Severe=sim(net,PATTERNS);
act.Severe
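A minimal sketch of how testing could look once the net is trained (newcases is a hypothetical matrix arranged like PATTERNS, one variable per row and one case per column):
% Sketch only; 'newcases' is a placeholder for unseen data
testOut = sim(net,newcases);             % 3 x Ncases network outputs
[~,predictedClass] = max(testOut,[],1)   % rows 1/2/3 correspond to the Normal/Abnormal/Severe rows of dis.out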
0 comments
Ankur Dutt
on 7 May 2015
Hello experts! I am not familiar with neural networks. I want to feed a database to the inputs and targets of a neural network. I made .mat files for both databases, named I (inputs) and H (outputs), but I get an error that I is not defined.
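If the variables only exist inside the .mat files, they have to be loaded into the workspace before they can be used; a sketch (the file names here are assumptions):
S = load('myInputs.mat');    % hypothetical file containing variable I
I = S.I;
S = load('myTargets.mat');   % hypothetical file containing variable H
H = S.H;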
0 comments