Main Content

Build Deep Neural Networks

Build networks using command-line functions or interactively using the Deep Network Designer app

Build networks from scratch using MATLAB® code or interactively using the Deep Network Designer app. Use built-in layers to construct networks for tasks such as classification and regression. To see a list of built-in layers, see List of Deep Learning Layers. You can then analyze your network to understand the network architecture and check for problems before training.
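For example, a small image classification network can be assembled from built-in layers and then inspected with `analyzeNetwork`. This is a minimal sketch; the input size (28-by-28 grayscale) and the number of classes (10) are illustrative assumptions.

```matlab
% Minimal sketch: a small image classification network built from
% built-in layers. Input size and class count are assumptions.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,"Padding","same")
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,"Stride",2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Check the architecture for errors before training.
analyzeNetwork(layers)
```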

If the built-in layers do not provide the layer that you need for your task, then you can define your own custom deep learning layer. You can specify a custom loss function using a custom output layer and define custom layers with or without learnable parameters. After defining a custom layer, you can check that the layer is valid, is GPU compatible, and outputs correctly defined gradients.
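A custom layer can be validated with `checkLayer` before you use it in a network. In this sketch, `myCustomLayer` is a hypothetical custom layer class and the input size is an assumption; replace both with your own.

```matlab
% Minimal sketch: validate a custom layer before using it in a network.
% myCustomLayer is a hypothetical custom layer class, and the input
% size is illustrative; substitute your own layer and dimensions.
layer = myCustomLayer;          % hypothetical custom layer
validInputSize = [24 24 20];    % example input size
checkLayer(layer,validInputSize,"ObservationDimension",4)
```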

For networks that cannot be created using layer graphs, you can define a custom network as a function. For an example showing how to train a deep learning model defined as a function, see Train Network Using Model Function.
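A model function takes the learnable parameters and the input data as arguments and returns the model output. The sketch below is illustrative only; the parameter struct and its fields (`fc1.Weights`, `fc1.Bias`) are assumptions, not a fixed interface.

```matlab
% Minimal sketch of a deep learning model defined as a function
% rather than a layer graph. The parameters struct and its fields
% are assumptions for illustration.
function Y = model(parameters,X)
    X = dlarray(X,"CB");    % format data as channel-by-batch
    Y = fullyconnect(X,parameters.fc1.Weights,parameters.fc1.Bias);
    Y = relu(Y);
end
```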

Apps

Deep Network Designer - Design, visualize, and train deep learning networks

Functions


Input Layers

imageInputLayer - Image input layer
image3dInputLayer - 3-D image input layer (Since R2019a)
sequenceInputLayer - Sequence input layer
featureInputLayer - Feature input layer (Since R2020b)

Convolution and Fully Connected Layers

convolution2dLayer - 2-D convolutional layer
convolution3dLayer - 3-D convolutional layer (Since R2019a)
groupedConvolution2dLayer - 2-D grouped convolutional layer (Since R2019a)
transposedConv2dLayer - Transposed 2-D convolution layer
transposedConv3dLayer - Transposed 3-D convolution layer (Since R2019a)
fullyConnectedLayer - Fully connected layer

Recurrent Layers

lstmLayer - Long short-term memory (LSTM) layer for recurrent neural network (RNN)
bilstmLayer - Bidirectional long short-term memory (BiLSTM) layer for recurrent neural network (RNN)
gruLayer - Gated recurrent unit (GRU) layer for recurrent neural network (RNN) (Since R2020a)
lstmProjectedLayer - Long short-term memory (LSTM) projected layer for recurrent neural network (RNN) (Since R2022b)
gruProjectedLayer - Gated recurrent unit (GRU) projected layer for recurrent neural network (RNN) (Since R2023b)
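Recurrent layers are typically combined with a sequence input layer. This sketch shows a sequence classification network built around an LSTM layer; the feature count (12), hidden-unit count (100), and class count (4) are illustrative assumptions.

```matlab
% Minimal sketch: sequence classification with an LSTM layer.
% Feature, hidden-unit, and class counts are assumptions.
layers = [
    sequenceInputLayer(12)
    lstmLayer(100,"OutputMode","last")   % output only the last time step
    fullyConnectedLayer(4)
    softmaxLayer
    classificationLayer];
```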

Transformer Layers

selfAttentionLayer - Self-attention layer (Since R2023a)
positionEmbeddingLayer - Position embedding layer (Since R2023b)
sinusoidalPositionEncodingLayer - Sinusoidal position encoding layer (Since R2023b)
embeddingConcatenationLayer - Embedding concatenation layer (Since R2023b)
indexing1dLayer - 1-D indexing layer (Since R2023b)

Neural ODE Layers

neuralODELayer - Neural ODE layer (Since R2023b)

Activation Layers

reluLayer - Rectified linear unit (ReLU) layer
leakyReluLayer - Leaky rectified linear unit (ReLU) layer
clippedReluLayer - Clipped rectified linear unit (ReLU) layer
eluLayer - Exponential linear unit (ELU) layer (Since R2019a)
tanhLayer - Hyperbolic tangent (tanh) layer (Since R2019a)
swishLayer - Swish layer (Since R2021a)
geluLayer - Gaussian error linear unit (GELU) layer (Since R2022b)
softmaxLayer - Softmax layer
sigmoidLayer - Sigmoid layer (Since R2020b)
functionLayer - Function layer (Since R2021b)

Normalization Layers

batchNormalizationLayer - Batch normalization layer
groupNormalizationLayer - Group normalization layer (Since R2020b)
instanceNormalizationLayer - Instance normalization layer (Since R2021a)
layerNormalizationLayer - Layer normalization layer (Since R2021a)
crossChannelNormalizationLayer - Channel-wise local response normalization layer

Utility Layers

dropoutLayer - Dropout layer
crop2dLayer - 2-D crop layer
crop3dLayer - 3-D crop layer (Since R2019b)

Data Manipulation

sequenceFoldingLayer - Sequence folding layer (Since R2019a)
sequenceUnfoldingLayer - Sequence unfolding layer (Since R2019a)
flattenLayer - Flatten layer (Since R2019a)

Pooling and Unpooling Layers

averagePooling2dLayer - 2-D average pooling layer
averagePooling3dLayer - 3-D average pooling layer (Since R2019a)
globalAveragePooling2dLayer - 2-D global average pooling layer (Since R2019b)
globalAveragePooling3dLayer - 3-D global average pooling layer (Since R2019b)
globalMaxPooling2dLayer - 2-D global max pooling layer (Since R2020a)
globalMaxPooling3dLayer - 3-D global max pooling layer (Since R2020a)
maxPooling2dLayer - 2-D max pooling layer
maxPooling3dLayer - 3-D max pooling layer (Since R2019a)
maxUnpooling2dLayer - Max unpooling layer

Combination Layers

additionLayer - Addition layer
multiplicationLayer - Multiplication layer (Since R2020b)
concatenationLayer - Concatenation layer (Since R2019a)
depthConcatenationLayer - Depth concatenation layer
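Combination layers have multiple inputs, so they are connected using a layer graph. This sketch builds a residual (skip) connection with `additionLayer`; the layer names and sizes are illustrative assumptions.

```matlab
% Minimal sketch: a residual (skip) connection built with a layer
% graph and an addition layer. Names and sizes are illustrative.
lgraph = layerGraph([
    imageInputLayer([32 32 3],"Name","in")
    convolution2dLayer(3,16,"Padding","same","Name","conv1")
    reluLayer("Name","relu1")
    additionLayer(2,"Name","add")]);   % relu1 connects to "add/in1"

% Skip branch: a 1-by-1 convolution from the input to the second
% input of the addition layer.
lgraph = addLayers(lgraph,convolution2dLayer(1,16,"Name","skip"));
lgraph = connectLayers(lgraph,"in","skip");
lgraph = connectLayers(lgraph,"skip","add/in2");
```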

Output Layers

classificationLayer - Classification output layer
regressionLayer - Regression output layer
Network Construction and Analysis

layerGraph - Graph of network layers for deep learning
plot - Plot neural network architecture
addLayers - Add layers to layer graph or network
removeLayers - Remove layers from layer graph or network
replaceLayer - Replace layer in layer graph or network
connectLayers - Connect layers in layer graph or network
disconnectLayers - Disconnect layers in layer graph or network
DAGNetwork - Directed acyclic graph (DAG) network for deep learning
resnetLayers - Create 2-D residual network (Since R2021b)
resnet3dLayers - Create 3-D residual network (Since R2021b)
isequal - Check equality of deep learning layer graphs or networks (Since R2021a)
isequaln - Check equality of deep learning layer graphs or networks, ignoring NaN values (Since R2021a)
analyzeNetwork - Analyze deep learning network architecture
resetState - Reset state parameters of neural network
dlnetwork - Deep learning network for custom training loops (Since R2019b)
addInputLayer - Add input layer to network (Since R2022b)
summary - Print network summary (Since R2022b)
initialize - Initialize learnable and state parameters of a dlnetwork (Since R2021a)
networkDataLayout - Deep learning network data layout for learnable parameter initialization (Since R2022b)
checkLayer - Check validity of custom or function layer
setL2Factor - Set L2 regularization factor of layer learnable parameter
getL2Factor - Get L2 regularization factor of layer learnable parameter
setLearnRateFactor - Set learn rate factor of layer learnable parameter
getLearnRateFactor - Get learn rate factor of layer learnable parameter
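The learn rate factor functions are commonly used to freeze layers, for example during transfer learning. This sketch sets the learning rate factors of a convolution layer's learnable parameters to zero so that training leaves them unchanged.

```matlab
% Minimal sketch: freeze a convolution layer's learnable parameters
% by setting their learning rate factors to zero.
layer = convolution2dLayer(3,16);
layer = setLearnRateFactor(layer,"Weights",0);
layer = setLearnRateFactor(layer,"Bias",0);

% Confirm the factor; during training the effective learning rate
% for these parameters is the global rate multiplied by this factor.
factor = getLearnRateFactor(layer,"Weights");
```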

Topics

Built-In Layers

Custom Layers