Error using trainNetwork: asking for same sequence length.
For some reason, the trainNetwork function is not working and throws this error:
Error using trainNetwork
Invalid training data. Sequence responses must have the same sequence
length as the corresponding predictors.
This is my current code for training:
%% Network
layers = [
    sequenceInputLayer(1,"Name","input")
    lstmLayer(128,"Name","lstm")
    dropoutLayer(0.5,"Name","drop")
    fullyConnectedLayer(2,"Name","fc")
    softmaxLayer("Name","softmax")
    classificationLayer("Name","classification")];
miniBatchSize = 27;
options = trainingOptions('adam', ...
    'ExecutionEnvironment','cpu', ...
    'MaxEpochs',250, ...
    'MiniBatchSize',miniBatchSize, ...
    'ValidationData',{XValidation,YValidation}, ...
    'GradientThreshold',2, ...
    'Shuffle','every-epoch', ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,layers,options);
The values inside XTrain, YTrain, XValidation, and YValidation are as follows:
XTrain: [screenshot of the variable contents]
YTrain: [screenshot of the variable contents]
I tried converting YTrain to non-cell categorical variables, but that also produced an error, so I had to use this format.
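By "non-cell categorical" I mean something along these lines (a minimal sketch; labelCell is just a placeholder name, not the actual variable in my workspace):
% One label per sequence, stored as a plain categorical vector with no cell wrapper.
labelCell = {'0'; '1'; '0'};            % example: one label per observation
YTrainCat = categorical(labelCell);     % 3-by-1 categorical vector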
XValidation and YValidation are also in the same format as above. What is the problem?
I don't see any problem with the sequence lengths.
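A pairwise check along these lines (a sketch; it assumes XTrain and YTrain are both N-by-1 cell arrays with time steps along the columns) should flag any pair that differs:
% Compare the number of time steps (columns) of each predictor/response pair.
% The error above implies every YTrain{i} must span the same number of time
% steps as the corresponding XTrain{i} when both are cell arrays of sequences.
predLen = cellfun(@(x) size(x, 2), XTrain);   % sequence length of each predictor
respLen = cellfun(@(y) size(y, 2), YTrain);   % sequence length of each response
mismatch = find(predLen ~= respLen);
if isempty(mismatch)
    disp('All predictor/response sequence lengths match.')
else
    fprintf('Length mismatch at observation(s): %s\n', mat2str(mismatch))
end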
The overall summary of the values looks like this: [screenshot of the variable summary]
Answers (1)
Ranjeet
on 8 Jun 2023
Hi Andrew,
It seems you are providing the wrong input size to the sequence input layer when initializing the following layer:
sequenceInputLayer(1, "Name", "input");
The XTrain input is a sequence of length 150000, as shown in the provided figure. Try changing the sequence input layer initialization to
sequenceInputLayer(150000, "Name", "input");
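As a quick sanity check (assuming XTrain is a cell array of numeric matrices), you can inspect one observation; the number of rows is the input size that sequenceInputLayer expects, and the number of columns is the sequence length:
% Rows of each cell = features per time step (the sequenceInputLayer input size),
% columns = time steps in the sequence.
[numRows, numCols] = size(XTrain{1});
fprintf('Rows (features): %d, columns (time steps): %d\n', numRows, numCols);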
Also, refer to the following resource, which deals with the same issue: