How can we save the neural network with the best validation loss?

4 views (last 30 days)
Arjun Desai
Arjun Desai on 9 Jul 2018
Commented: Nethtrick on 22 Sep 2020
Currently I am using the trainNetwork command to train my network model. I want to save the model with the best running validation loss. For example, say that at epoch 10 my validation loss is 0.2 and that is the lowest validation loss up to that point; then I would save that network model. If at epoch 11 the validation loss drops to 0.1, I would save that model as well (i.e. always keep the running-best validation-loss model).
My network contains batchNormalization layers, and as a result I cannot use the models saved at checkpoints, because their batchNormalization layers are not initialized.
Is there a workaround for this? I know that TensorFlow/Keras supports saving the model with the best validation loss even when it contains batchNormalization layers.

Answers (1)

Pablo Rivas
Pablo Rivas on 2 Jun 2019
I don't think this is possible yet.
** feature request **
It seems that, for now, you will have to save a checkpoint of your network every epoch; at the end, in the training summary, you can see which epoch gave you the best validation accuracy/error, then go back and find the checkpoint file that corresponds to that epoch.
However, this is space-consuming and not ideal at all. It would be really nice to have this feature, right?
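A minimal sketch of that recovery workflow (XTrain/YTrain/XVal/YVal and layers are placeholders for your own data and architecture, and the checkpoint file-name pattern shown is an assumption that varies by release):

% Save a checkpoint every epoch, then recover the one with the lowest
% validation loss after training.
checkpointDir = fullfile(tempdir, 'checkpoints');
if ~exist(checkpointDir, 'dir'), mkdir(checkpointDir); end

options = trainingOptions('sgdm', ...
    'MaxEpochs', 30, ...
    'ValidationData', {XVal, YVal}, ...
    'CheckpointPath', checkpointDir, ...
    'Verbose', false);

[net, info] = trainNetwork(XTrain, YTrain, layers, options);

% info.ValidationLoss is NaN at iterations where no validation ran;
% min ignores NaNs, so this finds the best validated iteration.
[bestLoss, bestIter] = min(info.ValidationLoss);
fprintf('Best validation loss %.4f at iteration %d\n', bestLoss, bestIter);

% Checkpoint file names embed the iteration number, e.g.
% net_checkpoint__390__2018_07_09__12_00_00.mat; pick the file whose
% iteration is closest to the best validated iteration.
files = dir(fullfile(checkpointDir, 'net_checkpoint__*.mat'));
iters = cellfun(@(f) sscanf(f, 'net_checkpoint__%d'), {files.name});
[~, idx] = min(abs(iters - bestIter));
best = load(fullfile(checkpointDir, files(idx).name));  % contains 'net'
bestNet = best.net;

Note that, as the comment below points out, checkpoints of networks containing batchNormalization layers may not be usable for prediction in this release.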
  1 Comment
Nethtrick
Nethtrick on 22 Sep 2020
Unfortunately, the checkpoint approach does not work with batch normalization layers. I am running into the same issue. It's an oversight not to have this built in, because "training" itself is defined as minimizing the loss function.
I also posted this question before I found your post:
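One possible workaround (a sketch under stated assumptions, not something this thread confirms) is a custom training loop with dlnetwork, available from R2019b onward. There the batch-normalization statistics travel with the network in net.State, so you can copy and save the network whenever the validation loss improves. The names below (trainWithBestValidation, modelLoss, mbqTrain, XVal, TVal) are hypothetical; mbqTrain is assumed to be a minibatchqueue (R2020b) over the training data, the network is assumed to end in a softmax layer, and XVal/TVal are formatted dlarray validation data.

function bestNet = trainWithBestValidation(net, mbqTrain, XVal, TVal, numEpochs)
% Hypothetical helper: trains a dlnetwork with SGDM and keeps a copy of
% the network that achieved the lowest validation loss so far.
vel = [];
bestLoss = Inf;
bestNet = net;
for epoch = 1:numEpochs
    shuffle(mbqTrain);
    while hasdata(mbqTrain)
        [X, T] = next(mbqTrain);
        [loss, gradients, state] = dlfeval(@modelLoss, net, X, T);
        net.State = state;                    % running batch-norm statistics
        [net, vel] = sgdmupdate(net, gradients, vel);
    end
    % Validate once per epoch; predict() uses the stored batch-norm stats.
    YVal = predict(net, XVal);
    valLoss = extractdata(crossentropy(YVal, TVal));
    if valLoss < bestLoss
        bestLoss = valLoss;
        bestNet = net;                        % State is copied along with it
        save('bestNet.mat', 'bestNet');
    end
end
end

function [loss, gradients, state] = modelLoss(net, X, T)
% Forward pass in training mode returns the updated batch-norm state.
[Y, state] = forward(net, X);
loss = crossentropy(Y, T);
gradients = dlgradient(loss, net.Learnables);
end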

Version

R2018a
