MATLAB Answers

How can I predict future values of a time series with a neural network?

19 views (last 30 days)
Ugur Can
Ugur Can on 4 Mar 2016
Answered: Abolfazl Nejatian on 23 Nov 2018
I have a time series of internet traffic rates: 14772 rows and 1 column. I use narnet from the Neural Network Time Series Toolbox, training on 70% of the series and testing on 30%. I need the MAPE, so I divided TargetSeries (the actual values from the .xlsx file) into two matrices: TrainSeries (the first 10340 values) and TestSeries (the last 4432 values), and I calculate the MAPE from TestSeries against the last 4432 values of TargetSeries. Finally, I need to predict future values of the time series. For example, I want the NAR network to predict the 15000th value. But I am confused: I commented out the closed-loop and prediction sections because I don't know how to use them. How do I predict future values: with the closed-loop network, the step-ahead prediction network, or both? I searched the internet and found a code fragment by Greg Heath, but I don't know how to set up its parameters for my problem (e.g. Xf, Af, Xs, Xi):
[ net tr ] = train( net, Xs, Ts, Xi, Ai );
[ Ys Xf Af ] = net( Xs, Xi, Ai );
Es = gsubtract(Ts,Ys);
% Finally, to predict M timesteps into the future, beyond the end of the target data:
Xic2 = Xf;
Aic2 = Af;
Ypred = netc2( cell(1,M), Xic2, Aic2);
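For reference, here is how that fragment fits together as one runnable sketch. This is an assumption-laden illustration, not Greg Heath's full code: it assumes TargetSeries is a 1-by-N cell array of scalar values (e.g. TargetSeries = num2cell(traffic'), with traffic a column vector), and it uses the three-argument form of closeloop to convert the final open-loop delay states into closed-loop initial states:

```matlab
% Sketch: open-loop NAR training, then closed-loop multistep prediction.
net = narnet(1:17, 5);                         % feedback delays 1:17, 5 hidden nodes
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, TargetSeries);
[net, tr] = train(net, Xs, Ts, Xi, Ai);
[Ys, Xf, Af] = net(Xs, Xi, Ai);                % Xf/Af are the final delay states
Es = gsubtract(Ts, Ys);                        % open-loop errors
% Close the loop; Xf/Af become the closed-loop initial states Xic2/Aic2:
[netc2, Xic2, Aic2] = closeloop(net, Xf, Af);
M = 228;                                       % e.g. 14772 + 228 = 15000th value
Ypred = netc2(cell(1, M), Xic2, Aic2);         % M predictions past the data
```

Feeding the closed-loop network M empty cells makes it run purely on its own fed-back outputs, which is what "predicting beyond the end of the target data" means here.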
Here is my code:
% Solve an Autoregression Time-Series Problem with a NAR Neural Network
% Script generated by Neural Time Series app
% Created 20-Feb-2016 15:46:59
% Choose a Training Function
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation.
% Create a Nonlinear Autoregressive Network
feedbackDelays = 1:17;
hiddenLayerSize = 5;
net = narnet(feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
% Setup Division of Data for Training, Validation, Testing
net.performParam.normalization ='standard';
%[trainInd,testInd] = divideind(14772,1:10340,10341:14772);
net.divideParam.trainInd = 1:10340;
net.divideParam.valInd = 10340:10340;
net.divideParam.testInd = 10341:14772;
[x,xi,ai,t] = preparets(net,{},{},TargetSeries);
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(x,xi,ai);
performance = perform(net,t,y);
% View the Network
%hold on
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
% netc = closeloop(net);
% netc.name = [net.name ' - Closed Loop'];
% view(netc)
% [xc,xic,aic,tc] = preparets(netc,{},{},T);
% yc = netc(xc,xic,aic);
% closedLoopPerformance = perform(net,tc,yc)
% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is
% given y(t+1). For some applications such as decision making, it would
% help to have predicted y(t+1) once y(t) is available, but before the
% actual y(t+1) occurs. The network can be made to return its output a
% timestep early by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as the original
% network, but outputs are shifted left one timestep.
% nets = removedelay(net);
% nets.name = [net.name ' - Predict One Step Ahead'];
% view(nets)
% [xs,xis,ais,ts] = preparets(nets,{},{},TargetSeries);
% ys = nets(xs,xis,ais);
% stepAheadPerformance = perform(nets,ts,ys);
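As a side note on the MAPE the question asks about, a minimal sketch (assuming y holds the open-loop predictions as a cell array aligned with the targets; the variable names TargetSeries and y are the question's own, and the 4432-value test span follows the question's split):

```matlab
% Hypothetical MAPE computation over the last 4432 (test) values.
actual    = cell2mat(TargetSeries(end-4431:end));  % actual test values
predicted = cell2mat(y(end-4431:end));             % matching predictions
MAPE = 100 * mean(abs((actual - predicted) ./ actual))
```

This only makes sense when the actual values are nonzero, as internet traffic rates should be; zeros in the denominator would make the MAPE undefined.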

Accepted Answer

Greg Heath
Greg Heath on 4 Mar 2016
0. There is no lower case "L" in Heath
1. Capitals for cells, lower case for doubles
2. OL and 'o' for OpenLoop, CL and 'c' for Closed Loop
3. Search the NEWSGROUP and ANSWERS (hit counts at the time):
narnet 40 165
narnet greg 14 144
narnet tutorial 8 38
4. Apply your code to the example data in help/doc narnet and/or one of the other example datasets in help/doc nndatasets
5. Run the example(s) with all defaults except divideblock before considering your own data with nondefault settings
Hope this helps
Thank you for formally accepting my answer
close all, clear all, clc, plt=0
T = simplenar_dataset;
t = cell2mat(T); [ I N ] = size(t) % [ 1 100 ]
vart1 = var(t,1) % MSE Reference 0.063306
% In general vart1 = mean(var(t',1))
Ntst = round(0.15*N), Nval = Ntst % 15, 15
Ntrn = N-Nval-Ntst % 70
% ASSUME no statistical differences in trn/val/tst
% subsets so that DIVIDEBLOCK can be used
trnind = 1:Ntrn; valind= Ntrn+1:Ntrn+Nval;
tstind = Ntrn+Nval+1:N;
ttrn = t(trnind); tval = t(valind); ttst=t(tstind);
plt = plt+1, figure(plt), hold on
% Plot shows no significant statistical differences
% in trn/val/tst subsets
% In general:
% A. deduce significant positive feedback delay lags,
% FD, from the autocorrelation function of ttrn
% B. For MSEgoal = vart1/200, determine the smallest
% successful number of hidden nodes, H, by trial and error
% successful number of hidden nodes, H, by trial and error
% C. For 1st run use defaults except for DIVIDEBLOCK
FD = 1:2, H = 10
neto = narnet; neto.divideFcn = 'divideblock';
[ Xo ,Xoi, Aoi, To ] = preparets( neto, {}, {}, T );
to = cell2mat(To); varto1 = var(to,1) %0.061307
[ neto tro Yo Eo Xof Aof] = train( neto , Xo, To, Xoi, Aoi );
% Equivalently: Yo = neto(Xo, Xoi, Aoi); Eo = gsubtract(To, Yo);
NMSEo = mse(Eo)/varto1 % 6.3328e-09
% Use training record tro to isolate predicted future
% nontraining (i.e., val and test) outputs and performance.
% For further predictions, must use the CL configuration.
%BUG WARNING: Division indices and ratios in tro are not
% consistent with those used above and stored in neto.
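To get from here to the original question of predicting past the end of the data, one possible continuation (a sketch, not part of Greg's post): close the loop, seed it with the final states Xof and Aof returned by train, and feed it M empty cells:

```matlab
% Predict M timesteps beyond the end of T (continuation sketch).
[netc, Xic, Aic] = closeloop(neto, Xof, Aof);  % convert final OL states to CL states
M = 10;                                        % prediction horizon past the last target
Ypredc = netc(cell(1, M), Xic, Aic);           % closed-loop future predictions
ypredc = cell2mat(Ypredc)
```

Closed-loop errors compound with each fed-back step, so the usable horizon M depends on how well the open-loop network generalized (the NMSEo above).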

More Answers (1)

Abolfazl Nejatian
Abolfazl Nejatian on 23 Nov 2018
Here is my code. It predicts time-series data using both a deep learning and a shallow learning algorithm.
Best wishes,
Abolfazl Nejatian
