How to deal with an out-of-memory error when training a large data set or a large neural network (NARX)

How do I deal with an out-of-memory error when training a large data set or a large neural network (stacked autoencoder)? I have just started analyzing deep neural networks for time series prediction. My guess is to use batch training, but can anyone help me with the details?
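What I have in mind by batch training is something like the sketch below (assuming the Neural Network Toolbox and hypothetical variables X for the inputs and T for the targets, with samples in columns). It trains on the data in chunks, so only one chunk is processed per call to train, which continues from the previous weights; this is only an illustration of the idea, not a working fix:

chunkSize = 500;                      % hypothetical chunk size
N = size(X, 2);
net = fitnet(10, 'trainscg');         % trainscg avoids trainlm's Jacobian memory
net = configure(net, X, T);           % set up sizes and ranges once from the full data
for k = 1:chunkSize:N
    idx = k:min(k + chunkSize - 1, N);
    net = train(net, X(:, idx), T(:, idx));   % continue training on the next chunk
end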

Answers (2)

Greg Heath on 15 Apr 2015
What are
1. input and target
2. size(input), size(target)
3. Significant lags of
a. target autocorrelation function
b. input/target cross correlation function
(one way to estimate these is sketched below)
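As a rough illustration of how those significant lags could be estimated, assuming the Signal Processing Toolbox function xcorr and hypothetical variables x (one input row) and t (the target row), both 1-by-N:

maxlag = 100;                            % hypothetical maximum lag to examine
Nobs = numel(t);
tz = t - mean(t);                        % remove means before correlating
xz = x - mean(x);
[acf, lagsA] = xcorr(tz, maxlag, 'coeff');       % target autocorrelation
[ccf, lagsC] = xcorr(tz, xz, maxlag, 'coeff');   % target/input cross-correlation
thresh = 1.96 / sqrt(Nobs);              % approximate 95% significance level
sigAutoLags  = lagsA(abs(acf) > thresh & lagsA > 0)   % lags with significant target autocorrelation
sigCrossLags = lagsC(abs(ccf) > thresh & lagsC > 0)   % lags with significant input/target correlation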

SAM on 15 Apr 2015 (edited: 15 Apr 2015)
I am extremely sorry, I wrote NARX but I actually tried it with a stacked autoencoder, hidden layers [100 150]. Even on an HPC server I got the same out-of-memory error.
Inputs are 4 x 2795 and the output is 1 x 2795.
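For data of that size, one thing worth trying is avoiding the Jacobian memory of trainlm. The sketch below is not the stacked-autoencoder workflow itself; it is a plain two-hidden-layer fitnet with the same [100 150] layer sizes, shown only to illustrate the memory-related training options, and it assumes hypothetical variables inputs (4 x 2795) and targets (1 x 2795):

net = fitnet([100 150], 'trainscg');   % conjugate-gradient training, no Jacobian storage
[net, tr] = train(net, inputs, targets);

% If a Jacobian-based trainer is required, its memory use can be traded for
% extra computation by splitting the Jacobian calculation into pieces:
% net = fitnet([100 150], 'trainlm');
% net.efficiency.memoryReduction = 10;
% [net, tr] = train(net, inputs, targets);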
