the output size of the last layer doesn't match the number of classes
I want to build a network with seven inputs and one output (11 classes: 1, 2, ..., 11).
I know I should use a combined or transformed datastore for the input.
I also made a MAT file that combines all of the inputs and the label in a cell array.
I created 3-D image arrays for each input and a label array for training.
Each MAT file holds a 1×8 cell array (7 input arrays plus 1 label) in the CombinedCell variable:
CombinedCell =
1×8 cell array
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[1]}
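For reference, here is a minimal sketch of how one such per-sample MAT file could be assembled, assuming the array sizes shown above; the file name and the rand placeholder data are only illustrative:
inputs = cell(1,7);
for k = 1:7
    inputs{k} = rand(10,9,640);    % placeholder for the k-th 3-D input array
end
label = 1;                         % numeric class label for this sample (1..11)
CombinedCell = [inputs, {label}];  % 1x8 cell: 7 inputs + 1 label
save('sample_001.mat','CombinedCell');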
Then I load all of the files into a datastore with the following code:
function [trainingDatastore] = read_datastore(folder)
    % Read every .mat file in the folder (and its subfolders) with load,
    % then pull the CombinedCell variable out of each loaded struct.
    tempdatastore = datastore(folder,'ReadFcn',@load,'IncludeSubfolders',1, ...
        'Type','file','FileExtensions','.mat');
    trainingDatastore = transform(tempdatastore,@rearrangeData);

    function out = rearrangeData(ds)
        out = ds.CombinedCell;   % ds is the struct returned by load
    end
end
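Assuming the per-sample MAT files live in a folder such as 'trainingData' (an illustrative name), the datastore can then be built and spot-checked like this:
trainingDatastore = read_datastore('trainingData');
sample = preview(trainingDatastore);   % 1x8 cell from the first file
size(sample{1})                        % should be 10 9 640
class(sample{end})                     % type of the stored label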
The code for my network structure:
nClasses = 11;
inputSize = [10 9 640 1];
lgraph = layerGraph();
% Multi_input is a cell array holding the seven input-branch names (defined elsewhere).
for ind = 1:length(Multi_input)
    Input = [
        image3dInputLayer(inputSize,"Name",Multi_input{ind})
        convolution3dLayer([1 1 16],32,"Name",[Multi_input{ind} '_conv3d_1'],"Padding","same")
        leakyReluLayer(0.01,"Name",[Multi_input{ind} '_leaky1'])
        maxPooling3dLayer([1 1 12],"Name",[Multi_input{ind} '_pool1'],"Padding","same")
        averagePooling3dLayer([1 1 64],"Name",[Multi_input{ind} '_avrpool1'],"Stride",[1 1 32])
        convolution3dLayer([10 9 1],16,"Name",[Multi_input{ind} '_spa1'],"Padding","same")
        leakyReluLayer(0.01,"Name",[Multi_input{ind} '_leaky2'])
        maxPooling3dLayer([5 5 1],"Name",[Multi_input{ind} '_pool2'],"Padding","same")];
    lgraph = addLayers(lgraph,Input);
    clear Input
end
bottom = [
    additionLayer(length(Multi_input),"Name","addition")
    convolution3dLayer([1 1 5],16,"Name",'Combined_conv3d_1',"Stride",1)
    fullyConnectedLayer(inputSize(1)*inputSize(2)*16,"Name","fc1")
    fullyConnectedLayer(nClasses,"Name","fc2")
    softmaxLayer("Name","softmax")
    classificationLayer("Name","classoutput")];
lgraph = addLayers(lgraph,bottom);
% Connect each input branch to the corresponding port of the addition layer.
for ind = 1:length(Multi_input)
    lgraph = connectLayers(lgraph,[Multi_input{ind} '_pool2'],['addition/in' num2str(ind)]);
end
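% Optional sanity check (not part of the original code): confirm the graph
% is valid and that all seven branches feed the addition layer before training.
analyzeNetwork(lgraph)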
[trainedNet, traininfo] = trainNetwork(trainingDatastore, lgraph, options);
MATLAB shows the error "The output size (11) of the last layer does not match the number of classes (1)."
How should I fix the problem?
2 comments
Walter Roberson
on 12 Jan 2021
You do not show us how you are moving from the datastore returning a cell array into the form needed by the layers?
Answers (1)
Walter Roberson
on 12 Jan 2021
A 1x8 cell array at that point encodes a single sample with a single class. Your input must include at least one representative of each of the 11 classes.
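One way to address this (a sketch, not from the original answer, assuming the label is stored as a numeric scalar as shown above) is to convert the label to a categorical that lists all 11 classes inside the transform function, so that trainNetwork sees 11 classes even when a single file holds only one sample:
function out = rearrangeData(ds)
    out = ds.CombinedCell;
    % Turn the numeric label (e.g. [1]) into a categorical whose category
    % list covers all 11 classes, matching the 11-way classification layer.
    out{end} = categorical(out{end}, 1:11);
end
Alternatively, the class list can be pinned on the output layer itself, e.g. classificationLayer("Name","classoutput","Classes",categorical(1:11)).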