How to force overfitting of a Deep Learning Network for Classification

1 view (last 30 days)
Wojciech Czop on 13 Jan 2020
Commented: Greg Heath on 18 Jan 2020
How can I force overfitting of the neural network proposed in the documentation example https://www.mathworks.com/help/deeplearning/examples/create-simple-deep-learning-network-for-classification.html when trained on the MNIST dataset?

Answers (2)

Srivardhan Gadila on 17 Jan 2020
As your question is specific to overfitting the network proposed in the example "Create Simple Deep Learning Network for Classification", I can suggest the following:
First suggestion:
When splitting the dataset into training and validation sets, do not split randomly. Instead, split deterministically (in file order):
[imdsTrain,imdsValidation] = splitEachLabel(imds,numTrainFiles); % remove the 'randomize' input argument
Then train the network until the training loss/accuracy saturates, as sketched below.
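A minimal sketch of that workflow, assuming imds and the layers array are defined as in the linked example (the epoch count here is illustrative, well past the few epochs the example uses):
numTrainFiles = 750;
[imdsTrain,imdsValidation] = splitEachLabel(imds,numTrainFiles); % deterministic split
options = trainingOptions('sgdm', ...
    'InitialLearnRate',0.01, ...
    'MaxEpochs',100, ...                     % keep training until the loss flattens out
    'ValidationData',imdsValidation, ...
    'Plots','training-progress');
net = trainNetwork(imdsTrain,layers,options); % layers: the array from the linked example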
Second suggestion:
Take only 10% of the original dataset used in the example, train on 75% of that subset, and validate on the remaining 25%. The network is then far too big for the reduced dataset, so it will overfit as the epochs progress.
The following code extracts a random 10% of the original dataset:
digitDatasetPath = fullfile(matlabroot,'toolbox','nnet','nndemos', ...
    'nndatasets','DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true,'LabelSource','foldernames');
% Display the number of samples per label
countEachLabel(imds)
numFiles = 100;
% Take a random 10% of the original dataset with equal samples per category
imds = splitEachLabel(imds,numFiles,'randomize');
% Display the number of samples per label after taking the 10% subset
countEachLabel(imds)
numTrainFiles = 75;
[imdsTrain,imdsValidation] = splitEachLabel(imds,numTrainFiles,'randomize');
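To complete the picture, here is a possible training step (a sketch, assuming the layers array from the linked example is already defined). On the small subset, training accuracy should pull away from validation accuracy in the training-progress plot:
options = trainingOptions('sgdm', ...
    'InitialLearnRate',0.01, ...
    'MaxEpochs',60, ...                      % enough epochs for the gap to appear
    'ValidationData',imdsValidation, ...
    'ValidationFrequency',10, ...
    'Plots','training-progress');
net = trainNetwork(imdsTrain,layers,options);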
You can also change the default trainingOptions, such as 'Momentum' and 'L2Regularization'.
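For example, setting 'L2Regularization' to zero removes the weight decay that would otherwise counteract overfitting (a sketch; the other values shown are illustrative):
options = trainingOptions('sgdm', ...
    'Momentum',0.9, ...            % the default momentum, kept explicit
    'L2Regularization',0, ...      % disable weight decay to encourage overfitting
    'InitialLearnRate',0.01, ...
    'MaxEpochs',100);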
You can also refer to Improve Shallow Neural Network Generalization and Avoid Overfitting and to questions related to overfitting in the MATLAB Answers community.

Greg Heath on 18 Jan 2020
OVERFITTING = More training unknowns (e.g., weights) than training vectors.
OVERTRAINING1 = Training an overfit network to or past convergence (DANGEROUS)
OVERTRAINING2 = Training any network past convergence (STUPID BUT NOT NECESSARILY DANGEROUS)
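By that definition, one rough check is to compare the number of learnable parameters with the number of training vectors (a sketch, assuming net is the SeriesNetwork returned by trainNetwork and imdsTrain is the training datastore; batch-normalization offsets and scales are not counted here):
numWeights = 0;
for i = 1:numel(net.Layers)
    % Count weights and biases of convolutional and fully connected layers
    if isprop(net.Layers(i),'Weights') && ~isempty(net.Layers(i).Weights)
        numWeights = numWeights + numel(net.Layers(i).Weights);
    end
    if isprop(net.Layers(i),'Bias') && ~isempty(net.Layers(i).Bias)
        numWeights = numWeights + numel(net.Layers(i).Bias);
    end
end
numTrainVectors = numel(imdsTrain.Files);
fprintf('Learnable parameters: %d, training vectors: %d\n',numWeights,numTrainVectors);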
HOPE THIS HELPS
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
