CNN+LSTM for EEG classification: training accuracy not getting better than 50% with 2 classes?

Hello everyone,
I am trying to use a CNN-LSTM network to classify healthy EEG data from diseased EEG data. Each signal is 1266 samples long with 3 channels. However, my training accuracy stays at 50% and does not improve. Here is my code and the training progress plot. Thanks in advance for your help.
numFeatures = 3;        % number of EEG channels
numHiddenUnits1 = 100;
numHiddenUnits2 = 150;
numHiddenUnits3 = 200;
numClasses = 2;         % healthy vs. diseased
filterSize = 3;
numFilters = 8;
layers = [ ...
    sequenceInputLayer(numFeatures,Normalization="zerocenter")
    % sequenceFoldingLayer('Name','fold')
    convolution1dLayer(filterSize,numFilters,Padding="same")
    reluLayer
    convolution1dLayer(filterSize,2*numFilters,Padding="same")
    reluLayer
    convolution1dLayer(filterSize,4*numFilters,Padding="same")
    reluLayer
    convolution1dLayer(filterSize,8*numFilters,Padding="same")
    reluLayer
    convolution1dLayer(filterSize,8*numFilters,Padding="same")
    reluLayer
    globalAveragePooling1dLayer
    % sequenceUnfoldingLayer('Name','unfold')
    flattenLayer('Name','flatten')
    bilstmLayer(numHiddenUnits1,'OutputMode','sequence')
    layerNormalizationLayer
    % dropoutLayer(0.6)
    bilstmLayer(numHiddenUnits2,'OutputMode','sequence')
    layerNormalizationLayer
    % dropoutLayer(0.6)
    bilstmLayer(numHiddenUnits2,'OutputMode','sequence')
    layerNormalizationLayer
    bilstmLayer(numHiddenUnits3,'OutputMode','last')
    layerNormalizationLayer
    dropoutLayer(0.6)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
lgraph = layerGraph(layers);
maxEpochs = 250;
miniBatchSize = 30;
options = trainingOptions('adam', ...
    'ExecutionEnvironment','gpu', ...
    'MaxEpochs',maxEpochs, ...
    'InitialLearnRate',0.01, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropFactor',0.1, ...
    'LearnRateDropPeriod',30, ...
    'MiniBatchSize',miniBatchSize, ...
    'GradientThreshold',1, ...
    'ValidationData',{zschannel2delay6test,ytest2}, ...
    'Shuffle','every-epoch', ...
    'ValidationFrequency',50, ...
    'ValidationPatience',Inf, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(zschannel2delay6,ytrain2,lgraph,options);
1 Comment
Farnaz Garehdaghi on 5 Sep 2022
and I have 440 samples for train and 160 samples for test. My train data is [440 1] cell and every row is [3 1266] and my labels matrix is a categorical [440 1] matrix.
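As a quick sanity check on that data layout, the following minimal sketch verifies the dimensions and the class balance (variable names zschannel2delay6 and ytrain2 are taken from the question). Accuracy stuck at chance level on two classes is often a data or label issue rather than an architecture issue:
size(zschannel2delay6)       % expected: 440-by-1 cell array
size(zschannel2delay6{1})    % expected: 3-by-1266 for each sequence
class(ytrain2)               % expected: categorical
countcats(ytrain2)           % samples per class, to rule out a severe imbalance
categories(ytrain2)          % the two class labels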


Answers (1)

Gayathri on 10 Jan 2025 at 9:03
Here are a few things you can try to improve the accuracy of the model:
  • A learning rate of 0.01 might be too high. Try reducing it to 0.001 or even lower to see if the network starts to learn better (see the sketch after this list).
  • Your network might be too complex for the given task. Try reducing the number of layers or hidden units in the LSTM layers.
  • Consider augmenting your data to provide more training examples.
  • Try training the model on a small subset of the data to see if it can overfit. If it can, the model architecture might be appropriate, and the issue could be with hyperparameters or the data.
  • Experiment with different batch sizes. Smaller batch sizes can sometimes help the model converge better.
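As a concrete starting point, here is a minimal sketch of how the learning-rate and batch-size suggestions could be applied to the trainingOptions call from the question, together with the small-subset overfitting check. The variable names are taken from the question; the particular values (0.001, 16, the first 40 samples) are only illustrative and worth tuning:
miniBatchSize = 16;                        % smaller mini-batch, as an experiment
options = trainingOptions('adam', ...
    'ExecutionEnvironment','gpu', ...
    'MaxEpochs',maxEpochs, ...
    'InitialLearnRate',0.001, ...           % reduced from 0.01
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropFactor',0.1, ...
    'LearnRateDropPeriod',30, ...
    'MiniBatchSize',miniBatchSize, ...
    'GradientThreshold',1, ...
    'ValidationData',{zschannel2delay6test,ytest2}, ...
    'Shuffle','every-epoch', ...
    'Verbose',false, ...
    'Plots','training-progress');

% Overfitting check on a small subset: if the network cannot reach close to
% 100% training accuracy on ~40 samples, revisit the data pipeline and the
% architecture before tuning hyperparameters further.
idxSmall = 1:40;                            % illustrative subset of the training data
netSmall = trainNetwork(zschannel2delay6(idxSmall),ytrain2(idxSmall),lgraph,options);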
Hope this helps!
