
I'm having trouble with convolution1dLayer

43 views (last 30 days)
nagihan yagmur on 26 Mar 2023
Commented: Matt J on 6 Apr 2023
layers = [
    featureInputLayer(24)
    convolution1dLayer(5, 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling1dLayer(2, 'Stride', 2)
    convolution1dLayer(5, 64, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling1dLayer(2, 'Stride', 2)
    convolution1dLayer(5, 128, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling1dLayer(2, 'Stride', 2)
    dropoutLayer(0.5)
    fullyConnectedLayer(5)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 128, ...
    'ValidationData', {XVal, YVal}, ...
    'ValidationFrequency', 50, ...
    'Shuffle', 'every-epoch', ...
    'Verbose', false, ...
    'Plots', 'training-progress');

% XTrain = 5000x24, YTrain = 5000x1, XTest = 5000x24, YTest = 5000x1
net = trainNetwork(XTrain, YTrain, layers, options);

% Compute the accuracy
YPred = classify(net, XTest);
accuracy = sum(YPred == YTest) / numel(YTest);
fprintf('Accuracy: %0.2f%%\n', 100*accuracy);
I have a dataset of 15300 records with 24 features (size 15300x24). My output consists of 5 classes (15300x1). I am trying to classify it with a CNN. When I define the layers, I encounter the following error:
Caused by:
Layer 2: Input data must have one spatial dimension only, one temporal dimension only, or one of each.
Instead, it has 0 spatial dimensions and 0 temporal dimensions.
I haven't been able to solve it.
2 Comments
Walter Roberson on 26 Mar 2023
Your XTrain is empty, somehow.
nagihan yagmur on 26 Mar 2023
Edited: Walter Roberson on 26 Mar 2023
clc;
clear all;
load veri_seti.mat
X = MyData.Inp;
Y = categorical(MyData.Out);
numClasses = 5;

layers = [ featureInputLayer(24,'Name','inputs')
    convolution1dLayer(128,3,'Stride',2)
    reluLayer() maxPooling1dLayer(2,'Stride',2)
    batchNormalizationLayer()
    convolution1dLayer(64,3,'Stride',1)
    reluLayer()
    maxPooling1dLayer(2,'Stride',2)
    batchNormalizationLayer()
    dropoutLayer(0.2)
    convolution1dLayer(32,3,'Stride',1)
    reluLayer()
    batchNormalizationLayer()
    convolution1dLayer(16,3,'Stride',1)
    reluLayer()
    batchNormalizationLayer()
    dropoutLayer(0.2)
    convolution1dLayer(8,3,'Stride',1)
    reluLayer()
    maxPooling1dLayer(2,'Stride',2)
    globalMaxPooling1dLayer()
    dropoutLayer(0.2)
    batchNormalizationLayer()
    fullyConnectedLayer(1024)
    fullyConnectedLayer(1024)
    softmaxLayer()
    classificationLayer()];

% Learning rate and other hyperparameters
miniBatchSize = 128;
maxEpochs = 30;
initialLearningRate = 0.001;
learnRateDropFactor = 0.1;
learnRateDropPeriod = 10;

% Training options
options = trainingOptions('adam', ...
    'MiniBatchSize', miniBatchSize, ...
    'MaxEpochs', maxEpochs, ...
    'InitialLearnRate', initialLearningRate, ...
    'LearnRateSchedule', 'piecewise', ...
    'LearnRateDropFactor', learnRateDropFactor, ...
    'LearnRateDropPeriod', learnRateDropPeriod, ...
    'Shuffle', 'every-epoch', ...
    'Verbose', false, ...
    'Plots', 'training-progress');
Error using vertcat
Dimensions of arrays being concatenated are not consistent.
Error in example (line 9)
layers = [ featureInputLayer(24,'Name','inputs')
I keep going back and forth between these two errors.
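Both errors have concrete causes: the vertcat failure comes from two layers written on one line inside the layer array (see Walter Roberson's answer below), and the "0 spatial dimensions and 0 temporal dimensions" failure occurs because featureInputLayer emits channel-only feature data, which convolution1dLayer has nothing to convolve over. A quick way to see this, assuming the layer array from the question is in the workspace, is the Network Analyzer:

% Network Analyzer lists each layer's output size/format and flags the
% incompatible convolution1dLayer directly.
analyzeNetwork(layers)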


Accepted Answer

Matt J on 4 Apr 2023
Edited: Matt J on 6 Apr 2023
Tech Support has suggested 2 workarounds to me. The simplest, in my opinion, is to recast the training as a 2D image classification problem in which one of the image dimensions is a singleton. This requires an imageInputLayer, converting the convolution and pooling layers to their 2D forms, and specifying one of their dimensions as a singleton.
load veri_seti
XTrain = reshape(MyData.Inp',24,1,1,[]);        % Dimensions: 24x1x1xBatch
YTrain = reshape(categorical(MyData.Out),[],1); % Dimensions: Batchx1

layers = [ imageInputLayer([24,1],'Name','inputs') %<--- Use imageInputLayer
    convolution2dLayer([5,1], 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer([2,1], 'Stride', [2,1])
    convolution2dLayer([5,1], 64, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer([2,1], 'Stride', [2,1])
    convolution2dLayer([5,1], 128, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer([2,1], 'Stride', [2,1])
    dropoutLayer(0.5)
    flattenLayer
    fullyConnectedLayer(5)
    softmaxLayer
    classificationLayer];

%analyzeNetwork(layers);

options = trainingOptions('adam', ...
    'MaxEpochs', 3, ...
    'MiniBatchSize', 128, ...
    'Verbose', false, ...
    'Plots', 'training-progress', 'ExecutionEnvironment', 'cpu');

net = trainNetwork(XTrain, YTrain(:), layers, options);
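To evaluate the trained network on held-out data with this 2D formulation, the test features need the same 24x1x1xBatch reshaping before classify is called. A minimal sketch, assuming XTest (Nx24) and YTest (Nx1 categorical) have been split out of the same dataset:

% Reshape the test features to the 24x1x1xBatch layout used for training
XTestImg = reshape(XTest', 24, 1, 1, []);
YPred = classify(net, XTestImg);
accuracy = sum(YPred == YTest(:)) / numel(YTest);
fprintf('Accuracy: %0.2f%%\n', 100*accuracy);

For comparison, another way to give convolution1dLayer a dimension to convolve over is to present each record as a 1-channel sequence of 24 time steps via sequenceInputLayer. This is only a sketch of that idea, not necessarily the second workaround Tech Support suggested:

% Sketch: treat each 24-feature record as a 1-channel sequence of length 24
XCell = num2cell(MyData.Inp, 2);              % 15300x1 cell array, each cell is 1x24
YSeq  = categorical(MyData.Out);

layersSeq = [
    sequenceInputLayer(1, 'MinLength', 24)    % 1 channel, sequences of length >= 24
    convolution1dLayer(5, 32, 'Padding', 'same')
    reluLayer
    maxPooling1dLayer(2, 'Stride', 2)
    convolution1dLayer(5, 64, 'Padding', 'same')
    reluLayer
    globalMaxPooling1dLayer                   % collapse the time dimension
    fullyConnectedLayer(5)
    softmaxLayer
    classificationLayer];

netSeq = trainNetwork(XCell, YSeq, layersSeq, options);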
3 Comments
nagihan yagmur on 6 Apr 2023
I am very, very grateful.
Matt J on 6 Apr 2023
I'm glad, but please click Accept on the answer to indicate that it worked.


More Answers (1)

Walter Roberson on 26 Mar 2023
Moved: Walter Roberson on 26 Mar 2023
layers = [ featureInputLayer(24,'Name','inputs')
convolution1dLayer(128,3,'Stride',2)
reluLayer() maxPooling1dLayer(2,'Stride',2)
Notice you have two layers on the same line.
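That is what triggers "Dimensions of arrays being concatenated are not consistent": the two layers on one line form a 1x2 row that cannot be stacked with the single-layer rows. A minimal fix is to give each layer its own line:

reluLayer()
maxPooling1dLayer(2,'Stride',2)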
3 Comments
Walter Roberson on 27 Mar 2023
Please show
whos -file veri_seti.mat
whos X Y
nagihan yagmur on 29 Mar 2023
The dataset is attached.



Version

R2022a
