setLearnRateFactor
Set learn rate factor of layer learnable parameter
Syntax
layerUpdated = setLearnRateFactor(layer,parameterName,factor)
layerUpdated = setLearnRateFactor(layer,parameterPath,factor)
netUpdated = setLearnRateFactor(net,layerName,parameterName,factor)
netUpdated = setLearnRateFactor(net,parameterPath,factor)
Description
layerUpdated = setLearnRateFactor(layer,parameterName,factor) sets the learn rate factor of the parameter with the name parameterName in layer to factor.
For built-in layers, you can set the learn rate factor directly by using the corresponding property. For example, for a convolution2dLayer layer, the syntax layer = setLearnRateFactor(layer,'Weights',factor) is equivalent to layer.WeightLearnRateFactor = factor.
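As a minimal sketch (the filter size and number of filters below are arbitrary choices for illustration), both of the following approaches double the learning rate of the convolution weights:
layer = convolution2dLayer(5,20);
layer = setLearnRateFactor(layer,'Weights',2);   % using setLearnRateFactor
% ...or, equivalently, assign the property directly:
layer.WeightLearnRateFactor = 2;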
layerUpdated = setLearnRateFactor(layer,parameterPath,factor) sets the learn rate factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a dlnetwork object in a custom layer.
netUpdated = setLearnRateFactor(net,layerName,parameterName,factor) sets the learn rate factor of the parameter with the name parameterName in the layer with name layerName for the specified dlnetwork object.
netUpdated = setLearnRateFactor(net,parameterPath,factor) sets the learn rate factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a nested layer.
Examples
Set and Get Learning Rate Factor of Learnable Parameter
Set and get the learning rate factor of a learnable parameter of a custom PReLU layer.
Create a layer array containing the custom layer preluLayer, attached to this example as a supporting file. To access this layer, open this example as a live script.
layers = [ ...
imageInputLayer([28 28 1])
convolution2dLayer(5,20)
batchNormalizationLayer
preluLayer
fullyConnectedLayer(10)
softmaxLayer
classificationLayer];
Set the learn rate factor of the Alpha learnable parameter of the preluLayer to 2.
layers(4) = setLearnRateFactor(layers(4),"Alpha",2);
View the updated learn rate factor.
factor = getLearnRateFactor(layers(4),"Alpha")
factor = 2
Set and Get Learning Rate Factor of Nested Layer Learnable Parameter
Set and get the learning rate factor of a learnable parameter of a nested layer.
Create a residual block layer using the custom layer residualBlockLayer, attached to this example as a supporting file. To access this file, open this example as a live script.
numFilters = 64;
layer = residualBlockLayer(numFilters)
layer = 
  residualBlockLayer with properties:

    Name: ''

   Learnable Parameters
    Network: [1x1 dlnetwork]

   State Parameters
    Network: [1x1 dlnetwork]

Use properties method to see a list of all properties.
View the layers of the nested network.
layer.Network.Layers
ans = 
  7x1 Layer array with layers:

     1   'conv_1'        2-D Convolution       64 3x3 convolutions with stride [1 1] and padding 'same'
     2   'batchnorm_1'   Batch Normalization   Batch normalization
     3   'relu_1'        ReLU                  ReLU
     4   'conv_2'        2-D Convolution       64 3x3 convolutions with stride [1 1] and padding 'same'
     5   'batchnorm_2'   Batch Normalization   Batch normalization
     6   'add'           Addition              Element-wise addition of 2 inputs
     7   'relu_2'        ReLU                  ReLU
Set the learning rate factor of the learnable parameter 'Weights' of the layer 'conv_1' to 2 using the setLearnRateFactor function.
factor = 2;
layer = setLearnRateFactor(layer,'Network/conv_1/Weights',factor);
Get the updated learning rate factor using the getLearnRateFactor function.
factor = getLearnRateFactor(layer,'Network/conv_1/Weights')
factor = 2
Set and Get Learn Rate Factor of dlnetwork Learnable Parameter
Set and get the learning rate factor of a learnable parameter of a dlnetwork object.
Create a dlnetwork object.
layers = [
    imageInputLayer([28 28 1],'Normalization','none','Name','in')
    convolution2dLayer(5,20,'Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','sm')];

lgraph = layerGraph(layers);
dlnet = dlnetwork(lgraph);
Set the learn rate factor of the 'Weights' learnable parameter of the convolution layer to 2 using the setLearnRateFactor function.
factor = 2;
dlnet = setLearnRateFactor(dlnet,'conv','Weights',factor);
Get the updated learn rate factor using the getLearnRateFactor function.
factor = getLearnRateFactor(dlnet,'conv','Weights')
factor = 2
Set and Get Learning Rate Factor of Nested dlnetwork Learnable Parameter
Set and get the learning rate factor of a learnable parameter of a nested layer in a dlnetwork object.
Create a dlnetwork object containing the custom layer residualBlockLayer, attached to this example as a supporting file. To access this file, open this example as a live script.
inputSize = [224 224 3];
numFilters = 32;
numClasses = 5;

layers = [
    imageInputLayer(inputSize,'Normalization','none','Name','in')
    convolution2dLayer(7,numFilters,'Stride',2,'Padding','same','Name','conv')
    groupNormalizationLayer('all-channels','Name','gn')
    reluLayer('Name','relu')
    maxPooling2dLayer(3,'Stride',2,'Name','max')
    residualBlockLayer(numFilters,'Name','res1')
    residualBlockLayer(numFilters,'Name','res2')
    residualBlockLayer(2*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res3')
    residualBlockLayer(2*numFilters,'Name','res4')
    residualBlockLayer(4*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res5')
    residualBlockLayer(4*numFilters,'Name','res6')
    globalAveragePooling2dLayer('Name','gap')
    fullyConnectedLayer(numClasses,'Name','fc')
    softmaxLayer('Name','sm')];

dlnet = dlnetwork(layers);
View the layers of the nested network in the layer 'res1'.
dlnet.Layers(6).Network.Layers
ans = 
  7x1 Layer array with layers:

     1   'conv_1'        2-D Convolution       32 3x3x32 convolutions with stride [1 1] and padding 'same'
     2   'batchnorm_1'   Batch Normalization   Batch normalization with 32 channels
     3   'relu_1'        ReLU                  ReLU
     4   'conv_2'        2-D Convolution       32 3x3x32 convolutions with stride [1 1] and padding 'same'
     5   'batchnorm_2'   Batch Normalization   Batch normalization with 32 channels
     6   'add'           Addition              Element-wise addition of 2 inputs
     7   'relu_2'        ReLU                  ReLU
Set the learning rate factor of the learnable parameter 'Weights' of the layer 'conv_1' to 2 using the setLearnRateFactor function.
factor = 2;
dlnet = setLearnRateFactor(dlnet,'res1/Network/conv_1/Weights',factor);
Get the updated learning rate factor using the getLearnRateFactor function.
factor = getLearnRateFactor(dlnet,'res1/Network/conv_1/Weights')
factor = 2
Freeze Learnable Parameters of dlnetwork Object
Load a pretrained network.
net = squeezenet;
Convert the network to a layer graph, remove the output layer, and convert it to a dlnetwork object.
lgraph = layerGraph(net);
lgraph = removeLayers(lgraph,'ClassificationLayer_predictions');
dlnet = dlnetwork(lgraph);
The Learnables property of the dlnetwork object is a table that contains the learnable parameters of the network. The table includes parameters of nested layers in separate rows. View the first few rows of the learnables table.
learnables = dlnet.Learnables;
head(learnables)
          Layer            Parameter           Value        
    __________________    ___________    ___________________
    "conv1"               "Weights"      {3x3x3x64  dlarray}
    "conv1"               "Bias"         {1x1x64    dlarray}
    "fire2-squeeze1x1"    "Weights"      {1x1x64x16 dlarray}
    "fire2-squeeze1x1"    "Bias"         {1x1x16    dlarray}
    "fire2-expand1x1"     "Weights"      {1x1x16x64 dlarray}
    "fire2-expand1x1"     "Bias"         {1x1x64    dlarray}
    "fire2-expand3x3"     "Weights"      {3x3x16x64 dlarray}
    "fire2-expand3x3"     "Bias"         {1x1x64    dlarray}
To freeze the learnable parameters of the network, loop over the learnable parameters and set the learn rate factor to 0 using the setLearnRateFactor function.
factor = 0;
numLearnables = size(learnables,1);

for i = 1:numLearnables
    layerName = learnables.Layer(i);
    parameterName = learnables.Parameter(i);
    dlnet = setLearnRateFactor(dlnet,layerName,parameterName,factor);
end
To use the updated learn rate factors when training, you must pass the dlnetwork object to the update function in the custom training loop. For example, use the command
[dlnet,velocity] = sgdmupdate(dlnet,gradients,velocity);
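For context, a minimal sketch of one update step in such a loop might look like the following. Here modelLoss, dlX, and dlT are placeholder names (not part of this example) for a model loss function and formatted dlarray input and target data.
velocity = [];
% Evaluate the model loss and gradients using automatic differentiation.
[loss,gradients] = dlfeval(@modelLoss,dlnet,dlX,dlT);
% Update the network; parameters with a learn rate factor of 0 remain unchanged.
[dlnet,velocity] = sgdmupdate(dlnet,gradients,velocity);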
Input Arguments
layer — Input layer
scalar Layer object
Input layer, specified as a scalar Layer object.
parameterName — Parameter name
character vector | string scalar
Parameter name, specified as a character vector or a string scalar.
factor — Learning rate factor
nonnegative scalar
Learning rate factor for the parameter, specified as a nonnegative scalar.
The software multiplies this factor by the global learning rate to determine the learning rate for the specified parameter. For example, if factor is 2, then the learning rate for the specified parameter is twice the current global learning rate. The software determines the global learning rate based on the settings specified with the trainingOptions function.
Example: 2
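As an illustrative sketch (the layer and learning rate value here are arbitrary), a factor of 2 combined with a global learning rate of 0.01 gives an effective learning rate of 0.02 for the weights:
options = trainingOptions('sgdm','InitialLearnRate',0.01);
layer = convolution2dLayer(5,20);
layer = setLearnRateFactor(layer,'Weights',2);   % effective learning rate: 0.01 * 2 = 0.02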
parameterPath — Path to parameter in nested layer
string scalar | character vector
Path to parameter in nested layer, specified as a string scalar or a character vector. A nested layer is a custom layer that itself defines a layer graph as a learnable parameter.
If the input to setLearnRateFactor is a nested layer, then the parameter path has the form "propertyName/layerName/parameterName", where:
- propertyName is the name of the property containing a dlnetwork object
- layerName is the name of the layer in the dlnetwork object
- parameterName is the name of the parameter
If there are multiple levels of nested layers, then specify each level using the form "propertyName1/layerName1/.../propertyNameN/layerNameN/parameterName", where propertyName1 and layerName1 correspond to the layer in the input to the setLearnRateFactor function, and the subsequent parts correspond to the deeper levels.
Example: For layer input to setLearnRateFactor, the path "Network/conv1/Weights" specifies the "Weights" parameter of the layer with name "conv1" in the dlnetwork object given by layer.Network.
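For instance, assuming a custom layer stored in the variable layer whose Network property contains a layer named 'conv1', the corresponding call is:
layer = setLearnRateFactor(layer,'Network/conv1/Weights',2);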
If the input to setLearnRateFactor is a dlnetwork object and the desired parameter is in a nested layer, then the parameter path has the form "layerName1/propertyName/layerName/parameterName", where:
- layerName1 is the name of the layer in the input dlnetwork object
- propertyName is the property of the layer containing a dlnetwork object
- layerName is the name of the layer in the dlnetwork object
- parameterName is the name of the parameter
If there are multiple levels of nested layers, then specify each level using the form "layerName1/propertyName1/.../layerNameN/propertyNameN/layerName/parameterName", where layerName1 and propertyName1 correspond to the layer in the input to the setLearnRateFactor function, and the subsequent parts correspond to the deeper levels.
Example: For dlnetwork input to setLearnRateFactor, the path "res1/Network/conv1/Weights" specifies the "Weights" parameter of the layer with name "conv1" in the dlnetwork object given by layer.Network, where layer is the layer with name "res1" in the input network net.
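For instance, assuming an input network net that contains a nested layer named 'res1', the corresponding call is:
net = setLearnRateFactor(net,'res1/Network/conv1/Weights',2);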
Data Types: char | string
net — Neural network
dlnetwork object
Neural network, specified as a dlnetwork object.
layerName — Layer name
string scalar | character vector
Layer name, specified as a string scalar or a character vector.
Data Types: char | string
Output Arguments
layerUpdated — Updated layer
Layer object
Updated layer, returned as a Layer object.
netUpdated — Updated network
dlnetwork object
Updated network, returned as a dlnetwork object.
Version History
Introduced in R2017b