Below is a probabilistic neural network for classification, but it is not working correctly. Can someone help me improve it, especially with how to recalculate the training, validation, and test performance?
davykol
on 20 Jan 2015
Answered: davykol
on 23 Jan 2015
close all; clear; clc
data = dlmread('iris.dat');
% data = dlmread('Dermatology.txt');
% data = [data(:,1) data(:,3:8) data(:,2)];
% inputs = [data(:,1) data(:,3:8)];
% targets = data(:,2);
% inputs = thyroidInputs;
% targets = thyroidTargets;
B = data(:,1:end-1);
B = B';
C = data(:,end);
n_class = max(C);
class = 1:1:n_class;
CC = zeros(n_class, size(data,1));
for k = 1:n_class
    for i = 1:size(data,1)
        if C(i) == k
            CC(k,i) = 1;
        else
            CC(k,i) = 0;
        end
    end
end
inputs = data';
targets = CC;
% T = ind2vec(C');
spread = 1;
net = newpnn(data', targets, spread);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
% net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
% net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Simulate PNN on training data
output = sim(net, data');
% convert PNN outputs
%output = vec2ind(output);
view(net)
net.trainFcn = 'trainlm';
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression','plotfit'};
[ent,tr] = train(net, data', CC);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';   % Divide data randomly
net.divideMode = 'sample';      % Divide up every sample
net.divideParam.trainRatio = 50/100;
net.divideParam.valRatio = 25/100;
net.divideParam.testRatio = 25/100;
nntraintool
% plotperform(tr)
% Testing the classifier: Test the Network
outputs = net(inputs);
errors = gsubtract(targets, outputs);
performance = perform(net, targets, outputs)
% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net, trainTargets, outputs)
valPerformance = perform(net, valTargets, outputs)
testPerformance = perform(net, testTargets, outputs)
% Plots
% Uncomment these lines to enable various plots.
figure, plotperform(tr)
figure, plottrainstate(tr)
figure, plotconfusion(targets, outputs)
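For reference, the mask-based recalculation at the end only works when `train` actually divides the data and fills in `tr.trainMask`/`tr.valMask`/`tr.testMask`, which requires a trainable network and the division parameters set before the call to `train`. A minimal sketch of that pattern, using `patternnet` as a stand-in trainable classifier (the hidden-layer size of 10 is an arbitrary placeholder, and `inputs`/`targets` are assumed to have the same layout as above):

```matlab
% Sketch: per-split performance with a trainable network (patternnet).
% Assumes inputs is an R-by-N matrix and targets is the 1-of-N matrix CC.
net = patternnet(10);                 % 10 hidden neurons (placeholder)
net.divideFcn = 'dividerand';         % set division BEFORE training
net.divideParam.trainRatio = 50/100;
net.divideParam.valRatio   = 25/100;
net.divideParam.testRatio  = 25/100;
[net,tr] = train(net, inputs, targets);
outputs = net(inputs);
% tr now holds the masks for the division used during training
trainPerformance = perform(net, targets .* tr.trainMask{1}, outputs)
valPerformance   = perform(net, targets .* tr.valMask{1},   outputs)
testPerformance  = perform(net, targets .* tr.testMask{1},  outputs)
```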
1 comment
Greg Heath
on 20 Jan 2015
1. Reformat your code so that it runs when cut and pasted into the command line.
2. Use one of the MATLAB example data sets
help nndatasets
doc nndatasets
Hope this helps.
Greg
Accepted Answer
Greg Heath
on 21 Jan 2015
NEWPNN is created, not trained.
help newpnn
doc newpnn
Try NEWRB, it trains itself.
help newrb
doc newrb
Search in NEWSGROUP and ANSWERS
greg newrb
Hope this helps
Thank you for formally accepting my answer
Greg
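A minimal sketch of the suggested replacement: `newrb` builds a radial basis network and adds neurons until the error goal is met, so no separate `train` call is needed (the `goal` and `spread` values below are placeholders, and `B`/`CC` are the input and 1-of-N target matrices from the question):

```matlab
% Sketch: radial basis network via NEWRB, which trains as it is created.
% B is the R-by-N input matrix, CC the 1-of-N target matrix.
goal   = 0.0;   % mean squared error goal (placeholder)
spread = 1.0;   % spread of the radial basis functions (placeholder)
net = newrb(B, CC, goal, spread);
outputs = sim(net, B);
predicted = vec2ind(outputs);   % convert back to class indices
```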
0 comments
More Answers (3)
davykol
on 20 Jan 2015
Edited: davykol
on 20 Jan 2015
1 comment
Greg Heath
on 21 Jan 2015
Edited: Greg Heath
on 21 Jan 2015
The computer cannot read the data matrix because the lines have different lengths.
size(data) = ?
size(input) = ?
size(target) = ?
Number of classes = ?
SreeHarish Muppirisetty
on 21 Jan 2015
You can try with the following steps:
1. Try increasing the number of hidden nodes, in general.
2. Design 10 neural networks with varying numbers of hidden nodes, and randomly divide your data into training, validation, and test sets (10 sets of each). You can also try different performance functions to see which gives you a good classification.
3. Compute the standard error of the mean (SEM), and select the neural network closest to the SEM.
HTH
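The steps above can be sketched roughly as follows, using `patternnet` and selecting by the best validation performance recorded in the training record (the hidden-layer sizes are arbitrary choices, and `inputs`/`targets` are assumed to be laid out as in the question):

```matlab
% Sketch: train several networks with varying hidden-layer sizes and
% keep the one with the best validation performance.
hiddenSizes = 2:2:20;          % candidate sizes (arbitrary)
bestPerf = Inf;
for h = hiddenSizes
    net = patternnet(h);
    net.divideParam.trainRatio = 0.5;
    net.divideParam.valRatio   = 0.25;
    net.divideParam.testRatio  = 0.25;
    [net,tr] = train(net, inputs, targets);
    if tr.best_vperf < bestPerf     % best validation error this run
        bestPerf = tr.best_vperf;
        bestNet  = net;
    end
end
```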
0 comments