How to add more datasets to a pre-trained network in deep learning
Sungkun
on 6 Jan 2020
Edited: Sungkun
on 2 Jul 2020
I have used 40,000 input images to train a network in deep learning, but I want to add more datasets to improve its accuracy.
Do you have any good suggestions? MATLAB recommends trainNetwork for deep learning. However, unfortunately, it seems the function cannot pick up training from a previous network.
Accepted Answer
Delprat Sebastien
on 6 Jan 2020
Edited: Delprat Sebastien
on 6 Jan 2020
I'm not sure this is possible, whatever deep-learning framework you are using: the network has been trained to classify images from a dataset A, so basically it has learned how to map images to labels for that specific dataset. If you now continue to train this network using only a dataset B, the training will progressively make the network forget about A (this is often called catastrophic forgetting).
The only solution I see would be to continue the training on a composite dataset (comprising both A and B) so that the initial classification over A is preserved.
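A minimal sketch of building such a composite datastore, assuming each dataset sits in a folder of labelled subfolders (the folder names datasetA and datasetB are placeholders, not from the original post):

% Combine both datasets into a single imageDatastore; labels are
% taken from the subfolder names of both datasets.
imdsAll = imageDatastore({'datasetA', 'datasetB'}, ...
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');

% Optionally hold out a fraction of the combined data for validation.
[imdsTrain, imdsVal] = splitEachLabel(imdsAll, 0.9, 'randomized');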
Look at the example about transfer learning; that is basically what you are trying to do. You have to modify the code to:
1) Build an imageDatastore with both datasets.
2) Keep the original network in place. Maybe rework the final layers to match the new number of classes (if there are any new classes in dataset B).
3) Decide whether you would like to train the whole network or just the final layers. If you have the same number of classes, then you can simply keep the same learning rate for all layers. If you have new classes, then the final layers have been replaced, so you will probably need to train the last layer with a higher learning rate for a few epochs.
A minimal sketch of this workflow follows the list.
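This sketch assumes the previously trained network is a SeriesNetwork saved in net.mat and that imdsTrain/imdsVal come from the snippet above; the file name and the layer indices (fully connected layer third from the end, as in the typical classification architecture) are assumptions, not details from the original question. For a DAGNetwork, use layerGraph and replaceLayer instead of indexing.

% Continue training from the previous network on the combined data.
load('net.mat', 'net');              % previously trained network (assumed file name)
layers = net.Layers;                 % layer array retains the learned weights

numClasses = numel(categories(imdsTrain.Labels));

% Only needed if dataset B adds new classes: replace the final layers
% so the output size matches, and let the fresh layer learn faster.
layers(end-2) = fullyConnectedLayer(numClasses, ...
    'WeightLearnRateFactor', 10, ...
    'BiasLearnRateFactor', 10);
layers(end) = classificationLayer;

opts = trainingOptions('sgdm', ...
    'InitialLearnRate', 1e-4, ...    % small global rate: earlier layers barely move
    'MaxEpochs', 5, ...
    'MiniBatchSize', 64, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', imdsVal, ...
    'Plots', 'training-progress');

netNew = trainNetwork(imdsTrain, layers, opts);

If the class set is unchanged, skip the layer replacement and pass net.Layers to trainNetwork as-is; the learned weights are then used as the starting point for further training on the combined data.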