
Build Deep Neural Networks

Build networks for sequence and tabular data using MATLAB® code or interactively using Deep Network Designer

Create new deep networks for tasks such as classification, regression, and forecasting by defining the network architecture from scratch. Build networks using MATLAB or interactively using Deep Network Designer.

For most tasks, you can use built-in layers. If no built-in layer suits your task, you can define your own custom layer. You can specify a custom loss function using a custom output layer, and you can define custom layers with learnable and state parameters. After defining a custom layer, you can check that the layer is valid, is GPU compatible, and outputs correctly defined gradients. For a list of supported layers, see List of Deep Learning Layers.
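As a minimal sketch of the layer-checking workflow described above, you can wrap an operation in a function layer and validate it with checkLayer. The operation and input size here are illustrative choices, not requirements.

```matlab
% Wrap a simple element-wise operation in a function layer.
layer = functionLayer(@(X) X.^2);

% checkLayer runs a suite of validity, GPU-compatibility, and
% gradient tests against the given input size. Here the input is
% 10 features with observations along the second dimension.
validInputSize = [10 1];
checkLayer(layer,validInputSize,ObservationDimension=2)
```

For a custom layer class with learnable or state parameters, pass an instance of the class in place of the function layer.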

For models that cannot be specified as networks of layers, you can define the model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
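A model defined as a function takes the learnable parameters and the input data as arguments and returns the output. This sketch assumes a parameters structure with Weights and Bias fields; all sizes are illustrative.

```matlab
% Illustrative parameter initialization (sizes are arbitrary).
parameters.Weights = dlarray(randn(10,5));
parameters.Bias    = dlarray(zeros(10,1));

% Formatted dlarray input: 5 features ("C"), batch of 8 ("B").
X = dlarray(randn(5,8),"CB");
Y = model(parameters,X);

% Model defined as a function using deep learning operations
% that act directly on dlarray data.
function Y = model(parameters,X)
    Y = fullyconnect(X,parameters.Weights,parameters.Bias);
    Y = relu(Y);
end
```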

Apps

Deep Network Designer - Design and visualize deep learning networks

Functions


Input Layers

sequenceInputLayer - Sequence input layer
featureInputLayer - Feature input layer (Since R2020b)

Recurrent Layers

lstmLayer - Long short-term memory (LSTM) layer for recurrent neural network (RNN)
bilstmLayer - Bidirectional long short-term memory (BiLSTM) layer for recurrent neural network (RNN)
gruLayer - Gated recurrent unit (GRU) layer for recurrent neural network (RNN) (Since R2020a)
lstmProjectedLayer - Long short-term memory (LSTM) projected layer for recurrent neural network (RNN) (Since R2022b)
gruProjectedLayer - Gated recurrent unit (GRU) projected layer for recurrent neural network (RNN) (Since R2023b)
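The recurrent layers above combine with the input and output layers into a typical sequence-classification stack. The sizes here (12 features, 100 hidden units, 9 classes) are illustrative only.

```matlab
% Sequence-classification network: LSTM with last-step output
% feeding a fully connected layer and softmax.
layers = [
    sequenceInputLayer(12)
    lstmLayer(100,OutputMode="last")
    fullyConnectedLayer(9)
    softmaxLayer];

% Assemble into a dlnetwork, which initializes the learnable
% parameters automatically.
net = dlnetwork(layers);
```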

Transformer Layers

selfAttentionLayer - Self-attention layer (Since R2023a)
attentionLayer - Dot-product attention layer (Since R2024a)
positionEmbeddingLayer - Position embedding layer (Since R2023b)
sinusoidalPositionEncodingLayer - Sinusoidal position encoding layer (Since R2023b)
embeddingConcatenationLayer - Embedding concatenation layer (Since R2023b)
indexing1dLayer - 1-D indexing layer (Since R2023b)
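As a sketch of how the attention layers fit into a network, a self-attention layer can sit directly in a sequence stack; the head count and channel sizes below are illustrative.

```matlab
% Small attention-based sequence classifier: self-attention with
% 4 heads and 48 key-query channels, followed by normalization.
layers = [
    sequenceInputLayer(12)
    selfAttentionLayer(4,48)
    layerNormalizationLayer
    fullyConnectedLayer(9)
    softmaxLayer];

net = dlnetwork(layers);
```

Position embedding and encoding layers are typically combined with the data path using addition or embedding concatenation layers rather than placed in series.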

Neural ODE Layers

neuralODELayer - Neural ODE layer (Since R2023b)

Convolution and Fully Connected Layers

convolution1dLayer - 1-D convolutional layer (Since R2021b)
transposedConv1dLayer - Transposed 1-D convolution layer (Since R2022a)
fullyConnectedLayer - Fully connected layer

Activation and Dropout Layers

reluLayer - Rectified Linear Unit (ReLU) layer
leakyReluLayer - Leaky Rectified Linear Unit (ReLU) layer
preluLayer - Parametrized Rectified Linear Unit (PReLU) layer (Since R2024a)
clippedReluLayer - Clipped Rectified Linear Unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer
tanhLayer - Hyperbolic tangent (tanh) layer
swishLayer - Swish layer (Since R2021a)
geluLayer - Gaussian error linear unit (GELU) layer (Since R2022b)
sigmoidLayer - Sigmoid layer (Since R2020b)
softmaxLayer - Softmax layer
dropoutLayer - Dropout layer
functionLayer - Function layer (Since R2021b)

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer (Since R2020b)
instanceNormalizationLayer - Instance normalization layer (Since R2021a)
layerNormalizationLayer - Layer normalization layer (Since R2021a)
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Pooling Layers

maxPooling1dLayer - 1-D max pooling layer (Since R2021b)
averagePooling1dLayer - 1-D average pooling layer (Since R2021b)
globalMaxPooling1dLayer - 1-D global max pooling layer (Since R2021b)
globalAveragePooling1dLayer - 1-D global average pooling layer (Since R2021b)
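The 1-D convolution and pooling layers above compose into a typical time-series regression stack; all filter sizes and counts here are illustrative.

```matlab
% Causal 1-D convolution followed by downsampling and global
% pooling, ending in a single regression output.
layers = [
    sequenceInputLayer(3)
    convolution1dLayer(5,32,Padding="causal")
    reluLayer
    maxPooling1dLayer(2,Stride=2)
    globalAveragePooling1dLayer
    fullyConnectedLayer(1)];

net = dlnetwork(layers);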

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer (Since R2020b)
concatenationLayer - Concatenation layer
depthConcatenationLayer - Depth concatenation layer

Data Manipulation

sequenceFoldingLayer - (Not recommended) Sequence folding layer
sequenceUnfoldingLayer - (Not recommended) Sequence unfolding layer
flattenLayer - Flatten layer
dlnetwork - Deep learning neural network (Since R2019b)
addLayers - Add layers to neural network
removeLayers - Remove layers from neural network
replaceLayer - Replace layer in neural network
connectLayers - Connect layers in neural network
disconnectLayers - Disconnect layers in neural network
addInputLayer - Add input layer to network (Since R2022b)
initialize - Initialize learnable and state parameters of a dlnetwork (Since R2021a)
networkDataLayout - Deep learning network data layout for learnable parameter initialization (Since R2022b)
setL2Factor - Set L2 regularization factor of layer learnable parameter
getL2Factor - Get L2 regularization factor of layer learnable parameter
setLearnRateFactor - Set learn rate factor of layer learnable parameter
getLearnRateFactor - Get learn rate factor of layer learnable parameter
plot - Plot neural network architecture
summary - Print network summary (Since R2022b)
analyzeNetwork - Analyze deep learning network architecture
checkLayer - Check validity of custom or function layer
isequal - Check equality of neural networks (Since R2021a)
isequaln - Check equality of neural networks ignoring NaN values (Since R2021a)
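The graph-editing functions above support building networks programmatically rather than from a single layer array. This sketch wires a skip connection with connectLayers; the layer names and sizes are illustrative.

```matlab
% Build a small network with a skip connection: the input feeds
% both the fully connected branch and the addition layer.
net = dlnetwork;
net = addLayers(net,[
    sequenceInputLayer(8,Name="in")
    fullyConnectedLayer(8,Name="fc1")
    reluLayer(Name="relu")
    additionLayer(2,Name="add")]);
net = connectLayers(net,"in","add/in2");  % skip connection

net = initialize(net);   % initialize learnable parameters
analyzeNetwork(net)      % inspect the resulting architecture
```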

Topics

Built-In Layers

Custom Layers