
Trouble training sequential data with neural network

1 view (last 30 days)
Pavlos Triantaris on 10 Nov 2017
Edited: Pavlos Triantaris on 10 Nov 2017
I am trying to get started with neural networks using my own data, so my initial attempt has been to classify communication signals of 3 different bandwidth values using their periodograms. A typical case (one periodogram from each class) looks like this:
So it is pretty straightforward. If the human eye can tell them apart, so can a NN, right?
As far as I can tell, the only built-in way to classify sequential data is via an LSTM network, so convolutional ones are out of the question. It is worth mentioning that the dataset consists of 18700 such sequences, each of size 1x12000.
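For reference, `trainNetwork` expects sequence data as an N-by-1 cell array, where each cell holds a numFeatures-by-seqLength matrix, with labels as a categorical vector. A minimal sketch of how the 18700 sequences could be arranged (the variable names `periodogramData` and `labels` are placeholders for wherever the data actually lives):

```matlab
% Hypothetical formatting step: periodogramData is assumed to be an
% 18700-by-12000 numeric matrix, labels an 18700-by-1 class vector.
numObs = 18700;
XTrain = cell(numObs,1);
for i = 1:numObs
    XTrain{i} = periodogramData(i,:);  % 1-by-12000 sequence per observation
end
YTrain = categorical(labels);          % one of the 3 bandwidth classes
```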
Following the example from the documentation, I define my LSTM network as such:
inputSize = 1;        % one feature (row) per time step
outputSize = 75;      % number of hidden units in the LSTM layer
outputMode = 'last';  % emit only the final time step's hidden state
numClasses = 3;       % three bandwidth classes
layers = [ ...
    sequenceInputLayer(inputSize)
    lstmLayer(outputSize,'OutputMode',outputMode)
    reluLayer()
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer]
options = trainingOptions('sgdm',...
    'InitialLearnRate',0.03,...
    'LearnRateSchedule','piecewise',...
    'LearnRateDropFactor',0.2,...
    'LearnRateDropPeriod',5,...
    'MaxEpochs',30,...
    'MiniBatchSize',64,...
    'Plots','training-progress')
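The training call itself would then look like this (assuming `XTrain` is a cell array of 1-by-12000 sequences and `YTrain` a matching categorical label vector; `XTest`/`YTest` are hypothetical held-out data):

```matlab
% Train the LSTM on the formatted sequence data
net = trainNetwork(XTrain, YTrain, layers, options);

% Evaluate on a held-out set
YPred = classify(net, XTest);
accuracy = mean(YPred == YTest);
```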
Then I let MATLAB train the network. The results are not even worth commenting upon:
So there is no significant improvement over 11 whole epochs, and performance stays worse than a random classifier. What should I do in this case? Do I need different parameters? A different network altogether? A different formatting of the data? What might be the solution to this seemingly simple problem?
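On the "different formatting" point: one common workaround for very long sequences (a sketch of an idea, not a verified fix for this dataset) is to fold each 1-by-12000 sequence into a multi-feature sequence with far fewer time steps, so the LSTM unrolls over e.g. 1000 steps instead of 12000. Note that a column-major `reshape` interleaves consecutive samples across the channels:

```matlab
% Hypothetical reshaping: XTrain is assumed to be a cell array of
% 1-by-12000 sequences. Each becomes 12-by-1000, i.e. 12 features
% per time step over 1000 time steps (samples interleaved by reshape).
inputSize = 12;   % sequenceInputLayer would then need this input size
XTrainFolded = cellfun(@(x) reshape(x, 12, 1000), XTrain, ...
    'UniformOutput', false);
```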

Answers (0)
