You can define your own custom deep learning layers for your problem. You can specify a custom loss function by using a custom output layer, and you can define custom layers with or without learnable parameters. After you define a custom layer, you can check that the layer is valid, is GPU compatible, and outputs correctly defined gradients.
Define Custom Deep Learning Layers
Learn how to define custom deep learning layers.
Define Custom Deep Learning Layer with Learnable Parameters
This example shows how to define a PReLU layer and use it in a convolutional neural network.
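As a rough sketch of the pattern this example covers, a layer with a learnable parameter subclasses nnet.layer.Layer and declares the parameter in a properties (Learnable) block. The class name, property name, and initialization below are illustrative assumptions, not necessarily those used in the example.

    classdef preluLayer < nnet.layer.Layer
        % Sketch of a PReLU layer with a learnable per-channel slope.
        properties (Learnable)
            Alpha   % Scaling coefficient applied to negative inputs
        end
        methods
            function layer = preluLayer(numChannels,name)
                % Set the layer name and description, and initialize Alpha.
                layer.Name = name;
                layer.Description = "PReLU with " + numChannels + " channels";
                layer.Alpha = rand([1 1 numChannels]);
            end
            function Z = predict(layer,X)
                % f(x) = x for x > 0, Alpha.*x otherwise.
                Z = max(X,0) + layer.Alpha .* min(0,X);
            end
        end
    end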
Define Custom Deep Learning Layer with Multiple Inputs
This example shows how to define a custom weighted addition layer and use it in a convolutional neural network.
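A minimal sketch of a multiple-input layer, assuming an elementwise weighted sum of the inputs: the constructor sets the NumInputs property, and predict accepts the inputs through varargin. Names are illustrative.

    classdef weightedAdditionLayer < nnet.layer.Layer
        % Sketch of a layer that adds multiple inputs, each scaled by a
        % learnable weight.
        properties (Learnable)
            Weights   % One scalar weight per input
        end
        methods
            function layer = weightedAdditionLayer(numInputs,name)
                layer.NumInputs = numInputs;
                layer.Name = name;
                layer.Description = "Weighted addition of " + numInputs + " inputs";
                layer.Weights = rand(1,numInputs);
            end
            function Z = predict(layer,varargin)
                % Sum the inputs, scaling each by its corresponding weight.
                Z = 0;
                for i = 1:layer.NumInputs
                    Z = Z + layer.Weights(i)*varargin{i};
                end
            end
        end
    end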
Define Custom Deep Learning Layer with Formatted Inputs
This example shows how to define a custom layer with formatted dlarray inputs.
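A sketch of the formatted-input pattern, assuming the nnet.layer.Formattable mixin: layers that inherit from it receive dlarray inputs with dimension labels, so predict can work with the format directly. The global-average-pooling-style operation below is an illustrative assumption, not the operation used in the example.

    classdef globalAverageExampleLayer < nnet.layer.Layer & nnet.layer.Formattable
        % Sketch of a layer that receives formatted dlarray inputs.
        methods
            function layer = globalAverageExampleLayer(name)
                layer.Name = name;
                layer.Description = "Average over spatial dimensions";
            end
            function Z = predict(~,X)
                % Use the format labels to find the spatial ("S") dimensions
                % and average over them; the output stays formatted.
                spatialDims = finddim(X,"S");
                Z = mean(X,spatialDims);
            end
        end
    end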
Specify Custom Layer Backward Function
This example shows how to define a PReLU layer and specify a custom backward function.
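For a single-input, single-output layer such as the PReLU sketch above, a custom backward function might look like the methods block below, added inside the class definition. The dimension handling (spatial dimensions 1 and 2, observations in dimension 4) is an assumption for 2-D image data.

    methods
        function [dLdX,dLdAlpha] = backward(layer,X,~,dLdZ,~)
            % Gradient with respect to the input: pass dLdZ through where
            % X > 0, and scale by Alpha elsewhere.
            dLdX = layer.Alpha .* dLdZ;
            dLdX(X > 0) = dLdZ(X > 0);

            % Gradient with respect to Alpha: sum over the spatial and
            % observation dimensions, keeping the channel dimension.
            dLdAlpha = min(0,X) .* dLdZ;
            dLdAlpha = sum(dLdAlpha,[1 2 4]);
        end
    end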
Define Custom Deep Learning Layer for Code Generation
This example shows how to define a PReLU layer that supports code generation.
Define Custom Classification Output Layer
This example shows how to define a custom classification output layer with sum of squares error (SSE) loss and use it in a convolutional neural network.
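A minimal sketch of the pattern: a custom classification output layer subclasses nnet.layer.ClassificationLayer and implements forwardLoss. The loss below assumes predictions Y and targets T of size 1-by-1-by-K-by-N (K classes, N observations); class and variable names are illustrative.

    classdef sseClassificationLayer < nnet.layer.ClassificationLayer
        % Sketch of a classification output layer with SSE loss.
        methods
            function layer = sseClassificationLayer(name)
                layer.Name = name;
                layer.Description = "Sum of squares error";
            end
            function loss = forwardLoss(~,Y,T)
                % Average the sum of squared errors over the observations.
                N = size(Y,4);
                loss = sum((Y-T).^2,"all")/N;
            end
        end
    end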
Define Custom Regression Output Layer
This example shows how to define a custom regression output layer with mean absolute error (MAE) loss and use it in a convolutional neural network.
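The regression counterpart subclasses nnet.layer.RegressionLayer. The sketch below assumes Y and T of size 1-by-1-by-R-by-N (R responses, N observations); names are illustrative.

    classdef maeRegressionLayer < nnet.layer.RegressionLayer
        % Sketch of a regression output layer with MAE loss.
        methods
            function layer = maeRegressionLayer(name)
                layer.Name = name;
                layer.Description = "Mean absolute error";
            end
            function loss = forwardLoss(~,Y,T)
                % Mean absolute error per observation, averaged over N.
                R = size(Y,3);
                N = size(Y,4);
                meanAbsoluteError = sum(abs(Y-T),3)/R;
                loss = sum(meanAbsoluteError)/N;
            end
        end
    end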
Specify Custom Output Layer Backward Loss Function
This example shows how to define a custom classification output layer with sum of squares error (SSE) loss and specify a custom backward loss function.
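For the SSE output layer sketched above, a custom backward loss function returns the derivative of the loss with respect to the predictions. Add a method like this inside the class definition; the observation dimension (4) is again an assumption.

    methods
        function dLdY = backwardLoss(~,Y,T)
            % Derivative of the SSE loss with respect to Y, averaged over
            % the N observations.
            N = size(Y,4);
            dLdY = 2*(Y-T)/N;
        end
    end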
Deep Learning Network Composition
Define custom layers containing layer graphs.
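One way to compose networks, sketched here as an assumption rather than the example's exact code: store a dlnetwork in a learnable property and forward inputs through it in predict. The Formattable mixin is included because a dlnetwork expects formatted dlarray inputs.

    classdef nestedNetworkLayer < nnet.layer.Layer & nnet.layer.Formattable
        % Sketch of a layer that wraps a layer graph as a dlnetwork.
        properties (Learnable)
            Network   % dlnetwork representing the nested layer graph
        end
        methods
            function layer = nestedNetworkLayer(net,name)
                layer.Name = name;
                layer.Description = "Nested network layer";
                layer.Network = net;   % net is an initialized dlnetwork
            end
            function Z = predict(layer,X)
                % Forward the formatted input through the nested network.
                Z = predict(layer.Network,X);
            end
        end
    end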
Define Nested Deep Learning Layer
This example shows how to define a nested deep learning layer.
Train Deep Learning Network with Nested Layers
This example shows how to train a network with nested layers.
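Once defined, a nested layer can be used like any other layer in a layer array. The snippet below is an illustrative usage sketch; blockNet, XTrain, YTrain, and all sizes and options are assumptions.

    % Assemble a network containing the nested layer sketched above.
    layers = [
        imageInputLayer([32 32 3])
        convolution2dLayer(3,16,'Padding','same')
        nestedNetworkLayer(blockNet,'nested1')   % blockNet: a dlnetwork
        fullyConnectedLayer(10)
        softmaxLayer
        classificationLayer];

    options = trainingOptions('adam','MaxEpochs',5);
    net = trainNetwork(XTrain,YTrain,layers,options);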
Check Custom Layer Validity
Learn how to check the validity of custom deep learning layers.
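To check a layer, create an instance and pass it to checkLayer together with a valid input size. The layer and sizes below are illustrative, reusing the PReLU sketch from earlier.

    % Check the PReLU layer sketch with 24-by-24 inputs and 20 channels,
    % with observations in dimension 4.
    layer = preluLayer(20,'prelu');
    validInputSize = [24 24 20];
    checkLayer(layer,validInputSize,'ObservationDimension',4)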