LSTM open loop testing problem
Gonzalo Postigo Omeñaca
on 9 Apr 2022
Answered: Akshat
on 20 Oct 2023
I have designed an LSTM network for electricity price forecasting. It has 14 input variables and one output (the day-ahead electricity price). Once the network is trained, I want to test it in an open loop. My idea is to update the network state with the real output values "YTest" and save the predicted results in "Y". Prediction itself is, of course, done with the "XTest" values, which are the inputs.
The example given by MathWorks (https://es.mathworks.com/help/deeplearning/ug/time-series-forecasting-using-deep-learning.html) does a prediction with 1 input and 1 output. As it is a time series, the input is the delayed (t-1) value of the output array, so "predictAndUpdateState" is used to update the network. In my case, I want to update it with the real values, which are saved in the array "YTest".
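The loop in that example looks roughly like this (simplified from the linked page; variable names are illustrative):

```matlab
% Closed-loop pattern from the MathWorks example (simplified):
% the previous prediction is fed back as the next input.
for t = 1:numPredictionTimeSteps
    [net, Ypred(:,t)] = predictAndUpdateState(net, Xprev);
    Xprev = Ypred(:,t);   % in an open loop, the true value would be used here instead
end
```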
This is the code I have tried to use, but the lines containing [net]=UpdateState are not correct. I attach the data used.
Thank you in advance.
CODE:
XTrain=INPUTSP5LSTM;
YTrain=OUTPUTSP1TRAIN;
XTest=INPUTSP5LSTMTEST;
YTest=OUTPUTSP1TEST;
layers = [
sequenceInputLayer(14,"Name","sequence")
lstmLayer(128,"Name","lstm_1")
lstmLayer(128,"Name","lstm_2")
fullyConnectedLayer(1,"Name","fc")
regressionLayer("Name","regressionoutput")];
options = trainingOptions("adam", ...
'MaxEpochs',100, ...
'GradientThreshold',1,...
'InitialLearnRate',0.005,...
'LearnRateSchedule','piecewise',...
'LearnRateDropPeriod',125,...
'LearnRateDropFactor',0.2,...
'Verbose',0,...
'Plots','training-progress');
net = resetState(net);
offset = 50;
[net] = UpdateState(YTest(:,1:offset));
numTimeSteps = 687;
numPredictionTimeSteps = numTimeSteps - offset;
Y = zero(numPredictionTimeSteps);
for t = 1:numPredictionTimeSteps
Y(:,t) = predict(net,XTest(:,offset+t));
[net]=UpdateState(YTest(:,offset+t));
end
Accepted Answer
Akshat
on 20 Oct 2023
Hi Gonzalo,
As per my understanding of the question, you are having trouble updating the network state based on the outputs you get.
The function "UpdateState" you mentioned is not suitable for this objective. There is a function called "predictAndUpdateState" which does exactly what you are trying to do: it predicts the next output and advances the network state in a single call.
Refer to the documentation of the "predictAndUpdateState" function for more information.
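As a sketch of the calling pattern (variable names here are illustrative, not from your data):

```matlab
% predictAndUpdateState returns both the prediction for the current step
% and the network with its hidden/cell state advanced by that step.
[net, YPred] = predictAndUpdateState(net, XStep);  % XStep: the 14-by-1 input at one time step
```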
I had put a line in the code (line 23) to check whether the network and data have any dimensionality issues, but I have commented it out now and the rest of the code runs.
I have reproduced the updated code on my end and you can find the code attached below.
XTrain = INPUTSP5LSTM;
YTrain = OUTPUTSP1TRAIN;
XTest = INPUTSP5LSTMTEST;
YTest = OUTPUTSP1TEST;
layers = [
sequenceInputLayer(14,"Name","sequence")
lstmLayer(128,"Name","lstm_1")
lstmLayer(128,"Name","lstm_2")
fullyConnectedLayer(1,"Name","fc")
regressionLayer("Name","regressionoutput")];
options = trainingOptions("adam", ...
'MaxEpochs', 100, ...
'GradientThreshold', 1, ...
'InitialLearnRate', 0.005, ...
'LearnRateSchedule', 'piecewise', ...
'LearnRateDropPeriod', 125, ...
'LearnRateDropFactor', 0.2, ...
'Verbose', 0, ...
'Plots', 'training-progress');
% net = trainNetwork(XTrain, YTrain, layers, options);
% You can uncomment the above line to try training the network; I did this
% to ensure that the network and dataset do not have any dimensionality
% issues.
offset = 50;
numTimeSteps = 687;
numPredictionTimeSteps = numTimeSteps - offset;
% One scalar prediction per time step, stored along the second dimension
Y = zeros(1, numPredictionTimeSteps);
% Reset the network state
net = resetState(net);
for t = 1:numPredictionTimeSteps
% Predict using the updated state
[net,Y(:, t)] = predictAndUpdateState(net, XTest(:, offset+t));
end
Hope this helps!
Regards
Akshat Wadhwa