Can we train only the classification layer when doing transfer learning of a pre-trained network?
Sud Sudirman
on 15 Jul 2020
Commented: Mathieu
on 2 Jul 2021
Hi,
My question relates to this article https://uk.mathworks.com/help/deeplearning/gs/get-started-with-transfer-learning.html
The question is: can we train only the classification layer when doing transfer learning of a pre-trained network? I want to speed up my training by keeping the feature extraction layers (the base model) as they are, and only replace and retrain the classification layers.
The equivalent in Keras (Python) is: base_model.trainable = False
If this is possible in MATLAB, please let me know how. Your help is appreciated.
Cheers
Sud
2 Comments
Greg Heath
on 17 Jul 2020
Check whether it is now possible to assign different learning rates to different layers. I wasn't able to some time ago.
Greg
Accepted Answer
Srivardhan Gadila
on 17 Jul 2020
To freeze the weights of a particular layer of your network, set its WeightLearnRateFactor and BiasLearnRateFactor properties to zero. Refer to the Learn Rate and Regularization sections of fullyConnectedLayer, convolution2dLayer, and lstmLayer.
layer.WeightLearnRateFactor = 0;
You can also refer to the Freeze Initial Layers section of the Train Deep Learning Network to Classify New Images example.
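Putting the above together, here is a minimal sketch of freezing a pre-trained base and retraining only the new classification layers. It assumes AlexNet (a SeriesNetwork); the layer indices (23 and 25 for the final fully connected and classification layers) are specific to AlexNet and must be adapted for other networks, and the class count and training options are placeholder values.

```matlab
% Load a pretrained network and copy its layer array.
net = alexnet;
layers = net.Layers;

% Replace the final fully connected layer and the classification layer
% so the network outputs the new task's classes.
numClasses = 5;                          % example value for the new task
layers(23) = fullyConnectedLayer(numClasses);
layers(25) = classificationLayer;

% Freeze every earlier layer that has learnable parameters by zeroing
% its learn rate factors; only the replaced layers will be updated.
for i = 1:22
    if isprop(layers(i), 'WeightLearnRateFactor')
        layers(i).WeightLearnRateFactor = 0;
        layers(i).BiasLearnRateFactor = 0;
    end
end

% Train as usual; the frozen layers keep their pre-trained weights.
% opts = trainingOptions('sgdm', 'InitialLearnRate', 1e-4);
% netTransfer = trainNetwork(imdsTrain, layers, opts);
```

Note that freezing via zero learn rate factors skips the weight updates, but the forward and backward passes still run through all layers, so the per-iteration cost does not drop as much as it would if the base-model activations were precomputed once and reused.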
2 Comments
Mathieu
on 2 Jul 2021
Hi,
OK, it works, but the training seems relatively slow. I expected it to be quicker. With your method, is the gradient still calculated for all layers?
More Answers (0)