Forward function with frozen batch normalization layers
In my application I have both batch normalization and dropout, and I would like to perform MC dropout using the forward function. Ideally, I would freeze the TrainedMean and TrainedVariance parameters of the batch normalization layers, but I cannot figure out whether this is possible. In my network, the batch normalization layers come after the convolutional layers, and the dropout comes after the recurrent layer. Thank you in advance.
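One possible workaround (an assumption on my part, not a confirmed answer): instead of calling forward, which runs batch normalization in training mode, keep the network in inference mode, where batch normalization already uses the frozen TrainedMean and TrainedVariance, and make only the dropout stochastic by swapping each dropoutLayer for a custom layer whose predict method still drops units. The class name mcDropoutLayer below is hypothetical; the custom-layer pattern follows the Deep Learning Toolbox custom layer API.

```matlab
% mcDropoutLayer.m -- hypothetical custom layer: dropout that stays
% active at prediction time, so MC dropout works with predict() while
% batch normalization layers keep using their trained statistics.
classdef mcDropoutLayer < nnet.layer.Layer
    properties
        Probability  % dropout probability p
    end
    methods
        function layer = mcDropoutLayer(p, name)
            layer.Name = name;
            layer.Probability = p;
        end
        function Z = predict(layer, X)
            % Inverted dropout applied even in inference mode
            mask = rand(size(X), 'like', X) > layer.Probability;
            Z = X .* mask ./ (1 - layer.Probability);
        end
    end
end
```

You could then use replaceLayer on the layer graph to substitute this for the original dropout layers, rebuild the dlnetwork, and call predict repeatedly to collect MC samples; since predict runs in inference mode, the batch normalization statistics stay frozen while the custom dropout remains stochastic. This is a sketch under those assumptions, not verified against your specific network.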
1 Comment
Imola Fodor
on 28 Feb 2024
Accepted Answer
More Answers (0)
