what do pruningActivations and pruningGradients refer to?
muhamed ibrahim
on 19 Apr 2025
Edited: Shantanu Dixit
on 21 Apr 2025
In the pruning using Taylor scores example, I can't understand what pruningActivations and pruningGradients refer to:
[loss,pruningActivations, pruningGradients, netGradients, state] = ...
dlfeval(@modelLossPruning, prunableNet, X, T);
4 Comments
Matt J
on 19 Apr 2025
Edited: Matt J
on 19 Apr 2025
No, it's not an answer to your question. It's to tell you that no human would be able to understand the context of your question from a single line of code taken from somewhere in the documentation, so perhaps you thought you were on a GPT engine. If you are looking for an AI interface, you can find one here.
Otherwise, providing a link to the full code and example would be a good start.
Walter Roberson
on 19 Apr 2025
The discussion appears to be related to https://www.mathworks.com/help/deeplearning/ref/deep.prune.taylorprunablenetwork.updateprunables.html
Accepted Answer
Shantanu Dixit
on 21 Apr 2025
Edited: Shantanu Dixit
on 21 Apr 2025
Hi Muhamed, if I understood the query correctly, you're referring to the example https://www.mathworks.com/help/deeplearning/ug/prune-image-classification-network-using-taylor-scores.html, which covers pruning networks for resource-efficient inference.
In this context, 'pruningActivations' refers to the outputs of the layers designated as prunable (e.g. in convolution layers, each activation corresponds to the output of a filter). Similarly, 'pruningGradients' refers to the gradients of the loss with respect to 'pruningActivations'; these measure sensitivity, so the larger the gradient, the greater the impact on the loss.
Correspondingly, the Taylor score is computed by 'updateScore' from the element-wise product of 'pruningActivations' and 'pruningGradients'. A higher score implies that the activation and its associated parameters are more critical (for more information, see the References section of the example: Article 2, Section 2.2, which details the criteria for pruning).
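For context, here is a minimal sketch of how these two outputs are typically produced inside the model loss function passed to dlfeval, based on the workflow in the linked example (the exact modelLossPruning implementation there may differ slightly):
function [loss, pruningActivations, pruningGradients, netGradients, state] = ...
        modelLossPruning(prunableNet, X, T)
    % Forward pass through the Taylor-prunable network also returns the
    % activations of the prunable filters.
    [YPred, state, pruningActivations] = forward(prunableNet, X);

    % Classification loss.
    loss = crossentropy(YPred, T);

    % Gradients of the loss with respect to the pruning activations (used
    % for Taylor scoring) and with respect to the learnable parameters
    % (used for the fine-tuning update).
    [pruningGradients, netGradients] = dlgradient(loss, ...
        pruningActivations, prunableNet.Learnables);
end
These outputs are then passed to updateScore, as shown below.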
% Compute first-order Taylor scores and accumulate the score across
% previous mini-batches of data.
prunableNet = updateScore(prunableNet, pruningActivations, pruningGradients);
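For completeness, a simplified skeleton of where these calls sit in the custom pruning loop; the loop structure and variable names such as maxPruningIterations and maxToPrune are illustrative, see the example for the full loop:
for pruningIteration = 1:maxPruningIterations
    % ... for each mini-batch (X, T) ...
    [loss, pruningActivations, pruningGradients, netGradients, state] = ...
        dlfeval(@modelLossPruning, prunableNet, X, T);
    prunableNet.State = state;

    % Accumulate Taylor scores from this mini-batch.
    prunableNet = updateScore(prunableNet, pruningActivations, pruningGradients);

    % After processing the mini-batches, remove the lowest-scoring filters.
    prunableNet = updatePrunables(prunableNet, MaxToPrune=maxToPrune);
end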
You can refer to the following pruning-related references for more information:
Pruning network using Taylor Scores: https://in.mathworks.com/help/deeplearning/ug/prune-image-classification-network-using-taylor-scores.html#PruningUsingTaylorPrunableNetworkExample-3
updateScore: https://www.mathworks.com/help/deeplearning/ref/deep.prune.taylorprunablenetwork.updatescore.html
Pruning CNNs for resource efficient inference: Molchanov, Pavlo, Stephen Tyree, Tero Karras, Timo Aila, and Jan Kautz. “Pruning Convolutional Neural Networks for Resource Efficient Inference.” Preprint, submitted June 8, 2017. https://arxiv.org/abs/1611.06440.
Hope this helps.
0 Comments
More Answers (0)