How can I get a confusion matrix after I train my AI using DeepNetworkDesigner?

Accepted Answer

Ganesh Gudipati on 12 Apr 2022
Hi,
The confusion matrix can be built using the MATLAB function plotconfusion.
For more information, refer to the plotconfusion documentation.
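As a rough sketch of how plotconfusion is typically called (the variable names here are illustrative; trainedNetwork_1 is the name Deep Network Designer uses when exporting to the workspace, and plotconfusion expects one-hot targets and network scores, each oriented classes-by-observations):

```matlab
% Score predictions for a labeled validation datastore (illustrative names).
scores  = predict(trainedNetwork_1, augimdsValidation);  % N-by-numClasses scores
targets = onehotencode(imdsValidation.Labels, 2);        % N-by-numClasses one-hot labels

% plotconfusion wants numClasses-by-N matrices, so transpose both.
plotconfusion(targets', scores');
```

If you only need the matrix itself rather than this particular plot style, confusionchart(trueLabels, predictedLabels) works directly on label arrays without one-hot encoding.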

6 Comments

Thank you for the reply.
I have tried plotconfusion, but since I use the Deep Network Designer app to train the AI, when training finishes I only get two variables in my workspace: 1. trainedNetwork_1 and 2. infostruct_1. I don't know which variables to use in plotconfusion, and when I try plotconfusion in the command window it says the variables are not found.
Hi,
Take some test data and run it through the trained network using the predict function. You can then use these results to plot a confusion matrix.
Hope it helps.
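The suggestion above can be sketched as follows, assuming you have a labeled test imageDatastore (imdsTest here is an illustrative name, and trainedNetwork_1 is the workspace-export name used by Deep Network Designer):

```matlab
% Resize the test images to the network's 227x227x3 input size.
augimdsTest = augmentedImageDatastore([227 227 3], imdsTest);

% classify returns predicted class labels directly (predict returns scores).
YPred = classify(trainedNetwork_1, augimdsTest);
YTrue = imdsTest.Labels;                 % ground-truth labels from folder names

confusionchart(YTrue, YPred);            % confusion matrix plot
accuracy = mean(YPred == YTrue)          % overall classification accuracy
```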
Create and Train a Deep Learning Model
Script for creating and training a deep learning network with the following properties:
Number of layers: 25
Number of connections: 24
Training setup file: C:\Users\ph20aap\OneDrive - University of Hertfordshire\matlab_project_saves\faces_0001_64_gpu\faces_0001_64_gpu.mat
Run this script to create the network layers, import training and validation data, and train the network. The network layers are stored in the workspace variable layers. The trained network is stored in the workspace variable net.
To learn more, see Generate MATLAB Code From Deep Network Designer.
Auto-generated by MATLAB on 11-Apr-2022 19:53:23
Load Initial Parameters
Load parameters for network initialization. For transfer learning, the network initialization parameters are the parameters of the initial pretrained network.
trainingSetup = load("C:\Users\ph20aap\OneDrive - University of Hertfordshire\matlab_project_saves\faces_0001_64_gpu\faces_0001_64_gpu.mat");
Import Data
Import training and validation data.
imdsTrain = imageDatastore("C:\Users\ph20aap\OneDrive - University of Hertfordshire\Data sets\FACES\Faces","IncludeSubfolders",true,"LabelSource","foldernames");
[imdsTrain, imdsValidation] = splitEachLabel(imdsTrain,0.75,"randomized");
% Resize the images to match the network input layer.
augimdsTrain = augmentedImageDatastore([227 227 3],imdsTrain);
augimdsValidation = augmentedImageDatastore([227 227 3],imdsValidation);
Set Training Options
Specify options to use when training.
opts = trainingOptions("sgdm",...
"ExecutionEnvironment","gpu",...
"InitialLearnRate",0.0001,...
"MaxEpochs",50,...
"MiniBatchSize",64,...
"Shuffle","every-epoch",...
"ValidationFrequency",30,...
"Plots","training-progress",...
"ValidationData",augimdsValidation);
Create Array of Layers
layers = [
imageInputLayer([227 227 3],"Name","data","Mean",trainingSetup.data.Mean)
convolution2dLayer([11 11],96,"Name","conv1","BiasLearnRateFactor",2,"Stride",[4 4],"Bias",trainingSetup.conv1.Bias,"Weights",trainingSetup.conv1.Weights)
reluLayer("Name","relu1")
crossChannelNormalizationLayer(5,"Name","norm1","K",1)
maxPooling2dLayer([3 3],"Name","pool1","Stride",[2 2])
groupedConvolution2dLayer([5 5],128,2,"Name","conv2","BiasLearnRateFactor",2,"Padding",[2 2 2 2],"Bias",trainingSetup.conv2.Bias,"Weights",trainingSetup.conv2.Weights)
reluLayer("Name","relu2")
crossChannelNormalizationLayer(5,"Name","norm2","K",1)
maxPooling2dLayer([3 3],"Name","pool2","Stride",[2 2])
convolution2dLayer([3 3],384,"Name","conv3","BiasLearnRateFactor",2,"Padding",[1 1 1 1],"Bias",trainingSetup.conv3.Bias,"Weights",trainingSetup.conv3.Weights)
reluLayer("Name","relu3")
groupedConvolution2dLayer([3 3],192,2,"Name","conv4","BiasLearnRateFactor",2,"Padding",[1 1 1 1],"Bias",trainingSetup.conv4.Bias,"Weights",trainingSetup.conv4.Weights)
reluLayer("Name","relu4")
groupedConvolution2dLayer([3 3],128,2,"Name","conv5","BiasLearnRateFactor",2,"Padding",[1 1 1 1],"Bias",trainingSetup.conv5.Bias,"Weights",trainingSetup.conv5.Weights)
reluLayer("Name","relu5")
maxPooling2dLayer([3 3],"Name","pool5","Stride",[2 2])
fullyConnectedLayer(4096,"Name","fc6","BiasLearnRateFactor",2,"Bias",trainingSetup.fc6.Bias,"Weights",trainingSetup.fc6.Weights)
reluLayer("Name","relu6")
dropoutLayer(0.5,"Name","drop6")
fullyConnectedLayer(4096,"Name","fc7","BiasLearnRateFactor",2,"Bias",trainingSetup.fc7.Bias,"Weights",trainingSetup.fc7.Weights)
reluLayer("Name","relu7")
dropoutLayer(0.5,"Name","drop7")
fullyConnectedLayer(6,"Name","fc")
softmaxLayer("Name","prob")
classificationLayer("Name","classoutput")];
Train Network
Train the network using the specified options and training data.
[net, traininfo] = trainNetwork(augimdsTrain,layers,opts);
This is the script I generated using Deep Network Designer. I tried y = predict(net, augimdsValidation); and it gives me the error:
Unrecognized function or variable 'net'.
As I am new to MATLAB, I am really struggling to figure out the issue here.
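One likely cause of that error, based on the script above: the variable net is only created once the final line, [net, traininfo] = trainNetwork(...), has finished running, and it must be queried in the same workspace. A minimal sketch of the sequence, assuming the generated script has been run to completion first:

```matlab
% Run the entire generated training script first, so that net and
% augimdsValidation exist in the workspace. Then, in that same workspace:
YPred = classify(net, augimdsValidation);   % predicted labels on validation data
YTrue = imdsValidation.Labels;              % true labels

confusionchart(YTrue, YPred)                % confusion matrix plot
accuracy = mean(YPred == YTrue)             % validation accuracy
```

If the network was instead exported from the app to the workspace, substitute trainedNetwork_1 for net.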
Export the network trained in Deep Network Designer to Simulink: on the Training tab, click Export > Export to Simulink.
Deep Network Designer saves the trained network as a MAT-file and generates Simulink blocks representing the trained network. The blocks generated depend on the type of network trained.
Now connect the Predict block from the Simulink library. Hope this solves the issue.
I am using MATLAB R2021a, as my university has not upgraded to the 2022 version, so I will have to use 2021a. In 2021a there are only two options under Export: 1. Export to workspace 2. Generate live script. The generated live script is the one I have posted above. Is there any other way to get the confusion matrix from the trained networks I have?
As I could not solve the issue in Deep Network Designer, I found a temporary solution: create the network using DND, export the code before training it, and copy and paste the code into the Experiment Manager app and do the training there. Experiment Manager will produce a confusion matrix for you.


More Answers (0)

Products

Version

R2021a

