checkLayer
Check validity of custom or function layer
Syntax
checkLayer(layer,layout1,...,layoutN)
checkLayer(layer,validInputSize)
checkLayer(___,Name=Value)
Description
checkLayer(layer,layout1,...,layoutN) checks the validity of a layer using the specified networkDataLayout objects, where N is the number of layer inputs and layoutK corresponds to the input layer.InputNames(K). (since R2023b)
checkLayer(layer,validInputSize) checks the validity of a custom or function layer using generated data of the sizes in validInputSize. For layers with a single input, set validInputSize to a typical size of input data to the layer. For layers with multiple inputs, set validInputSize to a cell array of typical sizes, where each element corresponds to a layer input. This syntax does not support layers that inherit from the nnet.layer.Formattable class.
checkLayer(___,Name=Value) specifies additional options using one or more name-value arguments.
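As a rough orientation, here is a minimal sketch of the three calling forms above, using an illustrative single-input function layer and input sizes (the layer and sizes are placeholders, not taken from the examples that follow):
% Illustrative single-input layer and a typical 2-D image input layout.
layer = functionLayer(@(X) 2*X);
layout = networkDataLayout([28 28 1 NaN],"SSCB");

checkLayer(layer,layout)                            % layout-based check (since R2023b)
checkLayer(layer,[28 28 1])                         % size-based check
checkLayer(layer,[28 28 1],ObservationDimension=4)  % with a name-value option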
Examples
Check Validity of Custom Flatten Layer
Create a function layer object that applies the flatten
function to the layer input. The flatten
function is defined at the end of this example and collapses the spatial dimensions of the input dlarray
into the channel dimension.
customFlattenLayer = functionLayer(@(X) flatten(X),Formattable=true)
customFlattenLayer = 
  FunctionLayer with properties:

             Name: ''
       PredictFcn: @(X)flatten(X)
      Formattable: 1
    Acceleratable: 0

   Learnable Parameters
    No properties.

   State Parameters
    No properties.

Use properties method to see a list of all properties.
Specify the size and dimensions of the inputs to the layer using networkDataLayout
objects.
layout = networkDataLayout([227 227 3 NaN],"SSCB")
layout = 
  networkDataLayout with properties:

      Size: [227 227 3 NaN]
    Format: 'SSCB'
Check that the layer is valid using the checkLayer
function.
checkLayer(customFlattenLayer,layout)
Skipping initialization tests. The layer does not have an initialize function.
Skipping GPU tests. No compatible GPU device found.
Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.
Running nnet.checklayer.TestLayerWithoutBackward
.......... ........
Done nnet.checklayer.TestLayerWithoutBackward
__________

Test Summary:
    18 Passed, 0 Failed, 0 Incomplete, 16 Skipped.
    Time elapsed: 0.44429 seconds.
In this case, the function does not detect any issues with the layer.
Flatten Function
The flatten
function receives a formatted dlarray
as input and collapses the spatial dimensions of the input dlarray
into the channel dimension. The input dlarray
must not contain time ("T"
) or unspecified ("U"
) dimensions.
function Y = flatten(X)

% Find spatial, channel, and batch dimensions.
idxS = finddim(X,"S");
idxC = finddim(X,"C");
idxB = finddim(X,"B");

% Determine size of spatial and channel dimensions.
sizeS = size(X,idxS);
sizeC = size(X,idxC);

if ~isempty(idxB)
    % If the input has a batch dimension, determine the size of the output
    % channel dimension.
    numChannels = sizeC*prod(sizeS,"all");
    sizeB = size(X,idxB);

    % Reshape and format output in "CB" format.
    X = reshape(X,[numChannels sizeB]);
    Y = dlarray(X,"CB");
else
    % If the input does not have a batch dimension, reshape and output in
    % "CU" format.
    X = X(:);
    Y = dlarray(X,"CU");
end

end
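As an additional sanity check, separate from the example itself, you can apply the flatten function to a small formatted dlarray and confirm that the spatial dimensions collapse into the channel dimension (the input size here is arbitrary):
X = dlarray(rand(4,4,3,2),"SSCB");  % 4-by-4 spatial, 3 channels, 2 observations
Y = flatten(X);
size(Y)  % 48-by-2: the 4*4*3 = 48 spatial and channel elements collapse into "C"
dims(Y)  % 'CB'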
Check Custom Layer Validity
Check the validity of the example custom layer sreluLayer
.
The custom layer sreluLayer
, attached to this example as a supporting file, applies the SReLU operation to the input data. To access this layer, open this example as a live script.
Create an instance of the layer.
layer = sreluLayer;
Create a networkDataLayout object that specifies the expected input size and format of typical input to the layer. Specify a valid input size of [24 24 20 128], where the dimensions correspond to the height, width, number of channels, and number of observations of the previous layer output. Specify that the data has the format "SSCB" (spatial, spatial, channel, batch).
validInputSize = [24 24 20 128];
layout = networkDataLayout(validInputSize,"SSCB");
Check the layer validity using checkLayer
. When you pass data through the network, the layer expects 4-D array inputs, where the first three dimensions correspond to the height, width, and number of channels of the previous layer output, and the fourth dimension corresponds to the observations.
checkLayer(layer,layout)
Skipping GPU tests. No compatible GPU device found.
Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.
Running nnet.checklayer.TestLayerWithoutBackward
.......... ..........
Done nnet.checklayer.TestLayerWithoutBackward
__________

Test Summary:
    20 Passed, 0 Failed, 0 Incomplete, 14 Skipped.
    Time elapsed: 0.1629 seconds.
The results show the number of passed, failed, and skipped tests. If you do not have a GPU, then the function skips the corresponding tests.
Check Function Layer Validity
Create a function layer object that applies the softsign operation to the input. The softsign operation is given by the function f(x) = x/(1 + |x|).
layer = functionLayer(@(X) X./(1 + abs(X)))
layer = 
  FunctionLayer with properties:

             Name: ''
       PredictFcn: @(X)X./(1+abs(X))
      Formattable: 0
    Acceleratable: 0

   Learnable Parameters
    No properties.

   State Parameters
    No properties.

Use properties method to see a list of all properties.
Check that the layer is valid using the checkLayer
function. Set the valid input size to the typical size of a single observation input to the layer. For example, for a single input, the layer expects observations of size h-by-w-by-c, where h, w, and c are the height, width, and number of channels of the previous layer output, respectively.
Specify validInputSize
as the typical size of an input array.
validInputSize = [5 5 20];
checkLayer(layer,validInputSize)
Skipping initialization tests. The layer does not have an initialize function.
Skipping multi-observation tests. To enable tests with multiple observations, specify a formatted networkDataLayout as the second argument or specify the ObservationDimension option. For 2-D image data, set ObservationDimension to 4. For 3-D image data, set ObservationDimension to 5. For sequence data, set ObservationDimension to 2.
Skipping GPU tests. No compatible GPU device found.
Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.
Running nnet.checklayer.TestLayerWithoutBackward
.......... ..
Done nnet.checklayer.TestLayerWithoutBackward
__________

Test Summary:
    12 Passed, 0 Failed, 0 Incomplete, 22 Skipped.
    Time elapsed: 0.23257 seconds.
The results show the number of passed, failed, and skipped tests. If you do not specify the ObservationDimension
option, or do not have a GPU, then the function skips the corresponding tests.
Check Multiple Observations
For multi-observation image input, the layer expects an array of observations of size h-by-w-by-c-by-N, where h, w, and c are the height, width, and number of channels, respectively, and N is the number of observations.
To check the layer validity for multiple observations, specify the typical size of an observation and set the ObservationDimension
option to 4.
layer = functionLayer(@(X) X./(1 + abs(X)));
validInputSize = [5 5 20];
checkLayer(layer,validInputSize,ObservationDimension=4)
Skipping initialization tests. The layer does not have an initialize function.
Skipping GPU tests. No compatible GPU device found.
Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.
Running nnet.checklayer.TestLayerWithoutBackward
.......... ........
Done nnet.checklayer.TestLayerWithoutBackward
__________

Test Summary:
    18 Passed, 0 Failed, 0 Incomplete, 16 Skipped.
    Time elapsed: 0.14452 seconds.
In this case, the function does not detect any issues with the layer.
Check Custom Layer for Code Generation Compatibility
Check the code generation compatibility of the custom layer codegenSReLULayer
.
The custom layer codegenSReLULayer
, attached to this example as a supporting file, applies the SReLU operation to the input data. To access this layer, open this example as a live script.
Create an instance of the layer.
layer = codegenSReLULayer;
Create a networkDataLayout
object that specifies the expected input size and format of typical input to the layer. Specify a valid input size of [24 24 20 128]
, where the dimensions correspond to the height, width, number of channels, and number of observations of the previous layer output. Specify the format as "SSCB"
(spatial, spatial, channel, batch).
validInputSize = [24 24 20 128];
layout = networkDataLayout(validInputSize,"SSCB");
Check the layer validity using checkLayer. To check for code generation compatibility, set the CheckCodegenCompatibility
option to true
. The checkLayer
function does not check that the layer uses MATLAB functions that are compatible with code generation. To check that the custom layer definition is supported for code generation, first use the Code Generation Readiness app. For more information, see Check Code by Using the Code Generation Readiness Tool (MATLAB Coder).
checkLayer(layer,layout,CheckCodegenCompatibility=true)
Skipping GPU tests. No compatible GPU device found.
Running nnet.checklayer.TestLayerWithoutBackward
.......... .......... .....
Done nnet.checklayer.TestLayerWithoutBackward
__________

Test Summary:
    25 Passed, 0 Failed, 0 Incomplete, 9 Skipped.
    Time elapsed: 1.1221 seconds.
The function does not detect any issues with the layer.
Input Arguments
layer
— Layer to check
nnet.layer.Layer
object | FunctionLayer
Layer to check, specified as an nnet.layer.Layer
or
FunctionLayer
object.
If the layer has learnable or state parameters that require initialization before the layer can be evaluated, or if the layer has a custom initialize function, then you must specify a layout or the layer must already be initialized.
For an example showing how to define your own custom layer, see Define Custom Deep Learning Layer with Learnable Parameters. To
create a layer that applies a specified function, use functionLayer
.
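For instance, here is a quick sketch of wrapping an arbitrary function in a layer and checking it (the function and input size are illustrative):
layer = functionLayer(@(X) min(max(X,0),6));        % illustrative clipping function
checkLayer(layer,[28 28 3],ObservationDimension=4)  % check with a typical image size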
layout1,...,layoutN
— Network data layouts
networkDataLayout
object
Since R2023b
Valid network data layouts for each input to the layer, specified as
networkDataLayout
objects.
- For layers with a single input, specify a single layout.
- For layers with multiple inputs, specify a layout for each input. For example, for a layer with two inputs, specify layout1,layout2, where layout1 corresponds to the valid network data layout for the first input and layout2 corresponds to the valid network data layout for the second input.
If the layer inherits from the nnet.layer.Formattable
class, you must specify a networkDataLayout
for each input
to the layer.
For large input sizes, the gradient checks take longer to run. To speed up
the check, specify a network data layout with a smaller size using the
Size
property.
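For instance, here is a sketch of swapping the full training input size for a smaller one to speed up the gradient checks, where layer is the layer under test and the sizes are illustrative:
% Full-size layout (slower gradient checks).
fullLayout = networkDataLayout([224 224 3 NaN],"SSCB");

% Smaller spatial size with the same format and number of channels.
smallLayout = networkDataLayout([24 24 3 NaN],"SSCB");
checkLayer(layer,smallLayout)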
validInputSize
— Valid input sizes
vector of positive integers | cell array of vectors of positive integers
Valid input sizes of the layer, specified as a vector of positive integers or cell array of vectors of positive integers.
- For layers with a single input, specify validInputSize as a vector of integers corresponding to the dimensions of the input data. For example, [5 5 10] corresponds to valid input data of size 5-by-5-by-10.
- For layers with multiple inputs, specify validInputSize as a cell array of vectors, where each vector corresponds to a layer input and the elements of the vectors correspond to the dimensions of the corresponding input data. For example, {[24 24 20],[24 24 10]} corresponds to the valid input sizes of two inputs, where 24-by-24-by-20 is a valid input size for the first input and 24-by-24-by-10 is a valid input size for the second input.
For more information, see Layer Input Sizes.
For large input sizes, the gradient checks take longer to run. To speed up the check, specify a smaller valid input size.
Example: [5 5 10]
Example: {[24 24 20],[24 24 10]}
Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64 | cell
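A sketch of the multiple-input case, where myTwoInputLayer stands for a hypothetical custom layer with two inputs:
layer = myTwoInputLayer;                   % hypothetical two-input custom layer
validInputSize = {[24 24 20],[24 24 10]};  % one typical size per input, in the order of layer.InputNames
checkLayer(layer,validInputSize,ObservationDimension=4)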
Name-Value Arguments
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.
Example: ObservationDimension=4 sets the observation dimension to 4.
ObservationDimension
— Observation dimension
positive integer | row vector of positive integers
Observation dimension, specified as a positive integer or row vector
of positive integers. The default is the position of the batch
("B"
) dimensions of the network data layouts
layout1,...,layoutN
.
The observation dimension specifies which dimension of the layer input data corresponds to observations. For example, if the layer expects input data of size h-by-w-by-c-by-N, where h, w, and c correspond to the height, width, and number of channels of the input data, respectively, and N corresponds to the number of observations, then the observation dimension is 4. For more information, see Layer Input Sizes.
If you specify a network data layout with a batch dimension or if you
specify the observation dimension, then the
checkLayer
function checks that the layer
functions are valid using generated data with mini-batches of size 1 and
2. Otherwise, the function skips the corresponding tests.
Example: 4
Example: [4 4 2]
Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64
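A sketch of the scalar and row-vector forms; the row-vector form is assumed here to supply one observation dimension per layer input, layer is a single-input layer under test, and twoInputLayer is a hypothetical two-input layer:
% Single input: observations along dimension 4 of an h-by-w-by-c-by-N array.
checkLayer(layer,[24 24 20],ObservationDimension=4)

% Two inputs: one observation dimension per input (both 2-D image inputs here).
checkLayer(twoInputLayer,{[24 24 20],[24 24 10]},ObservationDimension=[4 4])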
CheckCodegenCompatibility
— Flag to enable code generation tests
0 (false) (default) | 1 (true)
Flag to enable code generation tests, specified as
0
(false) or 1
(true).
If CheckCodegenCompatibility
is
1
(true), then you must specify a layout
whose Format
property includes
a batch ("B"
) dimension or specify the ObservationDimension
option.
The CheckCodegenCompatibility
option does not
support layers that inherit from
nnet.layer.Formattable
. Instead, use the
analyzeNetworkForCodegen
(MATLAB Coder) function.
In addition, when generating code that uses third-party libraries:
- Code generation supports custom layers with 2-D image or feature input only.
- The inputs and output of the layer forward functions must have the same batch size.
- Nonscalar properties must be a single, double, or character array.
- Scalar properties must have type numeric, logical, or string.
The checkLayer
function does not check that functions used by the layer
are compatible with code generation. To check that functions used by the custom layer also
support code generation, first use the Code Generation Readiness app. For more
information, see Check Code by Using the Code Generation Readiness Tool (MATLAB Coder).
For an example showing how to define a custom layer that supports code generation, see Define Custom Deep Learning Layer for Code Generation.
Data Types: logical
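A sketch of enabling the code generation tests without a networkDataLayout object, assuming the codegenSReLULayer supporting file from the earlier example; because no layout supplies a batch dimension, the ObservationDimension option is required:
layer = codegenSReLULayer;
checkLayer(layer,[24 24 20],ObservationDimension=4,CheckCodegenCompatibility=true)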
More About
Layer Input Sizes
For each layer, the valid network data layout depends on the output of the previous layer.
Layer Input | Example Shape | Example Data Format
---|---|---
2-D images | h-by-w-by-c-by-N numeric array, where h, w, c, and N are the height, width, number of channels of the images, and number of observations, respectively. | "SSCB"
3-D images | h-by-w-by-d-by-c-by-N numeric array, where h, w, d, c, and N are the height, width, depth, number of channels of the images, and number of image observations, respectively. | "SSSCB"
Vector sequences | c-by-N-by-s matrix, where c is the number of features of the sequence, N is the number of sequence observations, and s is the sequence length. | "CBT"
2-D image sequences | h-by-w-by-c-by-N-by-s array, where h, w, and c correspond to the height, width, and number of channels of the image, respectively, N is the number of image sequence observations, and s is the sequence length. | "SSCBT"
3-D image sequences | h-by-w-by-d-by-c-by-N-by-s array, where h, w, d, and c correspond to the height, width, depth, and number of channels of the image, respectively, N is the number of image sequence observations, and s is the sequence length. | "SSSCBT"
Features | c-by-N array, where c is the number of features, and N is the number of observations. | "CB"
For example, for 2-D image classification problems, create a
networkDataLayout
object specifying the size as [h w c
n]
and the format as "SSCB"
, where
h
, w
, and c
correspond
to the height, width, and number of channels of the images, respectively, and
n
corresponds to the number of observations.
Code generation supports custom layers with 2-D image or feature input only.
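As a concrete sketch of the 2-D image row in the table, where layer is the layer under test and the image size is illustrative:
% 64-by-64 RGB images; leave the batch size unspecified with NaN.
layout = networkDataLayout([64 64 3 NaN],"SSCB");
checkLayer(layer,layout)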
Algorithms
List of Tests
The checkLayer
function uses these tests to check the validity of custom
layers.
Test | Description
---|---
functionSyntaxesAreCorrect | The syntaxes of the layer functions are correctly defined.
predictDoesNotError | The predict function does not error.
forwardDoesNotError | When specified, the forward function does not error.
forwardPredictAreConsistentInSize | When forward is specified, the outputs of forward and predict are the same size.
backwardDoesNotError | When specified, backward does not error.
backwardIsConsistentInSize | When backward is specified, the outputs of backward are consistent in size: the derivative with respect to each input is the same size as the corresponding input, and the derivative with respect to each learnable parameter is the same size as the corresponding learnable parameter.
predictIsConsistentInType | The outputs of predict are consistent in type with the inputs.
forwardIsConsistentInType | When forward is specified, the outputs of forward are consistent in type with the inputs.
backwardIsConsistentInType | When backward is specified, the outputs of backward are consistent in type with the inputs.
gradientsAreNumericallyCorrect | When backward is specified, the gradients computed in backward are consistent with the numerical gradients.
backwardPropagationDoesNotError | When backward is not specified, the derivatives can be computed using automatic differentiation.
predictReturnsValidStates | For layers with state properties, the predict function returns valid states.
forwardReturnsValidStates | For layers with state properties, the forward function, if specified, returns valid states.
resetStateDoesNotError | For layers with state properties, the resetState function, if specified, does not error and resets the states to valid states.
 | For layers that inherit from the nnet.layer.Formattable class, the predict function returns a formatted dlarray with a channel dimension.
 | For layers that inherit from the nnet.layer.Formattable class, the forward function, if specified, returns a formatted dlarray with a channel dimension.
 | When you specify one or more networkDataLayout objects, the learnable parameters of the layer do not change after repeated initialization with the same networkDataLayout objects as input.
 | When you specify one or more networkDataLayout objects, the state parameters of the layer do not change after repeated initialization with the same networkDataLayout objects as input.
codegenPragmaDefinedInClassDef | The pragma "%#codegen" for code generation is specified in the class file.
layerPropertiesSupportCodegen | The layer properties support code generation.
predictSupportsCodegen | predict is valid for code generation.
doesNotHaveStateProperties | For code generation, the layer does not have state properties.
functionLayerSupportsCodegen | For code generation, the layer function must be a named function on the path and the Formattable property must be 0 (false).
Some tests run multiple times. These tests also check different data types and for GPU compatibility:
- predictIsConsistentInType
- forwardIsConsistentInType
- backwardIsConsistentInType
To execute the layer functions on a GPU, the functions must support inputs and outputs of
type gpuArray
with the underlying data type
single
.
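One common way to satisfy the type-consistency tests is to create any new arrays inside the layer functions with the same underlying type as the input, for example by using the 'like' option; the following is only a sketch of that pattern for an illustrative predict method, not a required implementation:
function Z = predict(layer,X)
    % Create any constants with the same underlying type as X so that
    % single and gpuArray inputs produce single and gpuArray outputs.
    scale = ones(1,'like',X);
    Z = max(X,0).*scale;   % illustrative operation
end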
For more information on the tests used by checkLayer
, see
Check Custom Layer Validity.
Version History
Introduced in R2018a
R2024a: Custom output layers are not recommended
Custom output layers are not recommended. Use a custom loss function in the
trainnet
function instead.
This recommendation means that these syntaxes are not recommended for custom output layer input:
checkLayer(layer,layout1,...,layoutN)
checkLayer(layer,validInputSize)
checkLayer(___,Name=Value)
There are no plans to remove support for custom output layers. However, the
trainnet
function has these advantages and is recommended
instead:
- trainnet supports dlnetwork objects, which support a wider range of network architectures that you can create or import from external platforms.
- trainnet enables you to easily specify loss functions. You can select from built-in loss functions or specify a custom loss function.
- trainnet outputs a dlnetwork object, which is a unified data type that supports network building, prediction, built-in training, visualization, compression, verification, and custom training loops.
- trainnet is typically faster than trainNetwork.
This table shows some typical usages of the trainNetwork
function with custom output layers and how to update your code to use the
trainnet
function instead.
Not Recommended | Recommended
---|---
net = trainNetwork(X,T,layers,options), where layers contains a custom output layer. | net = trainnet(X,T,layers,lossFcn,options), where layers specifies the same network without the custom output layer and lossFcn is a function handle that specifies the custom loss function.
net = trainNetwork(data,layers,options), where layers contains a custom output layer. | net = trainnet(data,layers,lossFcn,options), where layers specifies the same network without the custom output layer and lossFcn is a function handle that specifies the custom loss function.
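A sketch of the recommended pattern, assuming that training data XTrain and TTrain, a layer array layers without the custom output layer, and a trainingOptions object options already exist; the loss here is an illustrative mean squared error:
lossFcn = @(Y,T) mean((Y - T).^2,"all");              % custom loss as a function handle
net = trainnet(XTrain,TTrain,layers,lossFcn,options);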
R2023b: Check formattable custom layers and check custom layers without initializing
You can now use the checkLayer
function to check the validity of custom layers that
inherit from the nnet.layer.Formattable
class by specifying a
networkDataLayout object
as the second argument.
You can also check the validity of a custom layer with a custom initialize
function without first initializing the layer by using the
checkLayer
function and specifying a networkDataLayout
as the second argument.
See Also
networkDataLayout
| trainnet
| trainingOptions
| dlnetwork
| analyzeNetwork