Neural Network Normalization code

2 views (last 30 days)
mustafa alnasser on 31 Oct 2014
Edited: Greg Heath on 1 Nov 2014
Dear All,
I am using the standard NN code in MATLAB, and I notice that I get better results without data normalization. Does this code perform normalization?
clc; clear; close all;
[x1,TXT,RAW]   = xlsread('Model.xlsx','VslVsg');
[t1,TXT2,RAW2] = xlsread('Model.xlsx','OUT');
x = x1';
t = t1';
size(x)
size(t)
net = newpr(x,t,20);
% view(net)
net = init(net);
[net,tr] = train(net,x,t);
nntraintool
plotperform(tr)
%% Testing the Classifier
% The trained neural network can now be tested with the testing samples.
% This will give us a sense of how well the network will do when applied
% to data from the real world.
%
% The network outputs will be in the range 0 to 1, so we can use the
% vec2ind function to get the class indices as the position of the highest
% element in each output vector.
testX = x(:,tr.testInd);
testT = t(:,tr.testInd);
testY = net(testX);
testIndices = vec2ind(testY)
%%
% One measure of how well the neural network has fit the data is the
% confusion plot. Here the confusion matrix is plotted across all samples.
%
% The confusion matrix shows the percentages of correct and incorrect
% classifications. Correct classifications are the green squares on the
% matrix diagonal. Incorrect classifications form the red squares.
%
% If the network has learned to classify properly, the percentages in the
% red squares should be very small, indicating few misclassifications.
%
% If this is not the case then further training, or training a network
% with more hidden neurons, would be advisable.
plotconfusion(testT,testY)
%%
% Here are the overall percentages of correct and incorrect classification.
[c,cm,ind,per] = confusion(testT,testY)
errors = gsubtract(testT,testY);
fprintf('Percentage Correct Classification   : %f%%\n', 100*(1-c));
fprintf('Percentage Incorrect Classification : %f%%\n', 100*c);
%%
% Another measure of how well the neural network has fit the data is the
% receiver operating characteristic plot. This shows how the false
% positive and true positive rates relate as the thresholding of outputs
% is varied from 0 to 1.
%
% The farther left and up the line is, the fewer false positives need to
% be accepted in order to get a high true positive rate. The best
% classifiers will have a line going from the bottom left corner, to the
% top left corner, to the top right corner, or close to that.
plotroc(testT,testY)

Accepted Answer

Greg Heath on 1 Nov 2014
Edited: Greg Heath on 1 Nov 2014
I don't understand.
What is the purpose of this post?
If it is about normalization, the default for both input and target is mapminmax.
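For reference, a minimal sketch (assuming a two-layer patternnet and the same x and t as in the question) of where those defaults live and how to switch the normalization off for comparison; the property names are the standard toolbox settings, but the exact defaults may vary by release:

net = patternnet(20);
net.inputs{1}.processFcns              % typically {'removeconstantrows','mapminmax'}
net.outputs{2}.processFcns             % typically {'removeconstantrows','mapminmax'}

% To train without input/target normalization, drop mapminmax before training:
net.inputs{1}.processFcns  = {'removeconstantrows'};
net.outputs{2}.processFcns = {'removeconstantrows'};
[net,tr] = train(net,x,t);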
Very often the cause of bad performance is an unfortunate set of initial random weights. The counter is to make multiple designs with different initial rng states for each trial value of the number of hidden nodes.
P.S. newpr is obsolete. It has been replaced by patternnet.
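For illustration, a sketch of that multiple-design loop using patternnet; the candidate hidden node counts and the number of trials below are hypothetical, and x and t are the variables from the question:

rng(0)                                    % make the sequence of random initial weights reproducible
Hvals   = [5 10 20];                      % candidate numbers of hidden nodes (example values)
Ntrials = 10;                             % independent designs per candidate
bestPerf = Inf;
for H = Hvals
    for trial = 1:Ntrials
        net      = patternnet(H);         % patternnet replaces the obsolete newpr
        [net,tr] = train(net,x,t);        % fresh random initial weights each design
        if tr.best_tperf < bestPerf       % keep the design with the lowest test-set error
            bestPerf = tr.best_tperf;
            bestNet  = net;
        end
    end
end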
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (0)
