Adding Dropout to narxnet
7 views (last 30 days)
Andriy Artemyev on 23 Mar 2021
Greetings! I wanted to ask if/how it is possible to add a dropout layer to a narxnet to improve regularization. Unfortunately, I could not find any information elsewhere.
I have a narxnet that uses the last 3 lags of a time series plus an exogenous input to forecast the next timestep, and I would like to introduce regularization to help with overfitting. Thanks in advance! My current code looks as follows:
forecast_horizon = 1;                      % predict one step ahead
neurons = [5 5];                           % two hidden layers, 5 neurons each
delays = 3;                                % number of lagged values to use
inputDelays = 1:delays;                    % lags of the exogenous input
feedbackDelays = 1:delays;                 % lags of the target series
net = narxnet(inputDelays, feedbackDelays, neurons);
net.trainParam.epochs = 40;
net = removedelay(net, forecast_horizon);  % shift taps so the net outputs y(t+1)
Shashank Gupta on 29 Mar 2021
There are some workarounds for adding dropout to a narxnet: you can add dropout by defining a custom transfer function for one of the layers. Details on how to create a custom transfer function are shown in the link.
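To illustrate the idea (this is only a sketch, not a drop-in solution: for shallow networks, MATLAB expects a custom transfer function to follow the full template of an existing one, e.g. a copy of tansig.m and its +tansig package folder, with the modification made in its apply function), the core of "dropout as a transfer function" is zeroing a random subset of activations. The function name dropouttansig and the rate parameter below are hypothetical:

```matlab
function a = dropouttansig(n, rate)
% DROPOUTTANSIG Hypothetical sketch: tansig activation followed by
% inverted dropout. Only the forward pass is shown; a real custom
% transfer function also needs the derivative and metadata subfunctions
% from the transfer-function template.
if nargin < 2
    rate = 0.2;                    % assumed dropout probability
end
a = tansig(n);                     % standard hyperbolic tangent sigmoid
mask = rand(size(a)) > rate;       % keep each unit with probability 1-rate
a = a .* mask ./ (1 - rate);       % rescale so expected activation is unchanged
end
```

Note that dropout should only be active during training; at prediction time you would use the plain tansig output.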
Another convenient option is to not use a shallow network and instead go for a deep network. There are some resources you can check; try this. Once you implement it, you can simply add a dropout layer. I would prefer the deep-network route: it is easier, more reliable, and more convenient to implement.
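As a rough sketch of the deep-network route (assuming the Deep Learning Toolbox is available): instead of narxnet, build the lagged exogenous inputs and lagged feedback values into a feature vector yourself, then train a small regression network that includes dropoutLayer. The variable names XLagged and Ytarget are placeholders for your prepared data:

```matlab
% Sketch: NARX-style one-step-ahead regression with dropout,
% mirroring the original two hidden layers of 5 tansig neurons.
numLags = 3;
numFeatures = 2 * numLags;       % 3 input lags + 3 feedback lags per sample
layers = [
    featureInputLayer(numFeatures)
    fullyConnectedLayer(5)
    tanhLayer
    dropoutLayer(0.2)            % dropout for regularization
    fullyConnectedLayer(5)
    tanhLayer
    dropoutLayer(0.2)
    fullyConnectedLayer(1)
    regressionLayer];
options = trainingOptions('adam', 'MaxEpochs', 40);
% XLagged: N-by-numFeatures matrix, Ytarget: N-by-1 vector (your data)
% net = trainNetwork(XLagged, Ytarget, layers, options);
```

Dropout is automatically disabled at prediction time (predict), so no extra handling is needed there.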
I hope this helps.