Multivariate Regression (in time and features) Using LSTM

JORGE FILHO on 29 Jul 2021
Edited: JORGE FILHO on 29 Jul 2021
I am trying to feed an LSTM with different streamflow time series and their delayed sequences for gap filling. Let x be the initial matrix of selected predictors, one predictor per column and one time sample per row. To introduce time dependence, the predictors are interleaved with their delayed versions (for delays dt = 1:ndt, ndt being the maximum delay considered), as below:
for ii = 1:size(x,2)
    for j = 1:ndt
        % column ndt*(ii-1)+j holds predictor ii delayed by (j-1) steps
        x1(j:end, ndt*(ii-1)+j) = x(1:end-j+1, ii);
    end
end
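To make the construction concrete, here is a tiny made-up example (the numbers are illustrative only, not real streamflow data):

```matlab
% Illustrative run of the delay construction above on a toy matrix.
x   = (1:5)' * [1 10];                  % 5 time steps, 2 predictors
ndt = 2;                                % maximum delay considered
x1  = zeros(size(x,1), ndt*size(x,2));  % preallocate the delayed matrix
for ii = 1:size(x,2)
    for j = 1:ndt
        % column ndt*(ii-1)+j holds predictor ii delayed by (j-1) steps
        x1(j:end, ndt*(ii-1)+j) = x(1:end-j+1, ii);
    end
end
% x1(:,1) is predictor 1 undelayed, x1(:,2) is predictor 1 shifted down
% by one step (with a zero in the first row), and likewise for predictor 2.
```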
with the respective LSTM:
numFeatures = size(xTrain,1);
numResponses = size(yTrain,1);
numHiddenUnits = 300;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits)
    fullyConnectedLayer(numResponses)
    regressionLayer];
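For completeness, a minimal training call for this layer stack might look like the following (the solver and option values are illustrative assumptions, not part of my setup):

```matlab
% Illustrative training call; xTrain is numFeatures-by-numTimeSteps and
% yTrain is numResponses-by-numTimeSteps for sequence-to-sequence regression.
options = trainingOptions('adam', ...
    'MaxEpochs', 100, ...
    'GradientThreshold', 1, ...
    'Plots', 'training-progress');
% net = trainNetwork(xTrain, yTrain, layers, options);
```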
The target is a row vector y. Is there a more effective arrangement to introduce time dependencies in an LSTM? I mean, I have tried to associate every instance of y with a 3D array x2 containing the values of x (not of x1) from (t-ndt+1) to t:
for ii = ndt:size(x,1)
    % window of the last ndt time steps ending at time ii
    x2(:,:,ii-ndt+1) = x(ii-ndt+1:ii, :);
end
But I don't know how to adapt the respective LSTM. I know the "Sequence-to-Sequence Using Deep Learning" example, but it does not include explicit time dependencies.
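A sketch of what I think might work is sequence-to-one regression: each target value gets its own short sequence of the ndt preceding time steps, stored as a cell array of features-by-time matrices (the names xCell and yVec are made up here, and the code is untested):

```matlab
% Sequence-to-one arrangement: one ndt-step window per observation.
nObs  = size(x,1) - ndt + 1;
xCell = cell(nObs,1);
for ii = ndt:size(x,1)
    % window from (t-ndt+1) to t, transposed to features-by-time
    xCell{ii-ndt+1} = x(ii-ndt+1:ii, :)';
end
yVec = y(ndt:end)';          % one response per window, as a column vector

layers = [ ...
    sequenceInputLayer(size(x,2))
    lstmLayer(numHiddenUnits, 'OutputMode','last')  % emit only the final step
    fullyConnectedLayer(1)
    regressionLayer];
% net = trainNetwork(xCell, yVec, layers, trainingOptions('adam'));
```

With 'OutputMode','last' the LSTM sees the whole ndt-step window but produces a single output per window, which matches one y value per window.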
Thanks.

Answers (0)
