Custom Training Loops

Customize deep learning training loops and loss functions

If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss functions that you need, then you can define a custom training loop. For models that layer graphs do not support, you can define a custom model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
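
For orientation, the following is a minimal sketch of such a loop, built from functions listed on this page (dlnetwork, dlarray, dlfeval, dlgradient, crossentropy, and adamupdate). The network architecture, the randomly generated data, and hyperparameters such as numEpochs and learnRate are illustrative placeholders, not prescribed by the toolbox.

% Minimal custom training loop sketch (illustrative sizes and data).
numFeatures = 10;
numClasses = 3;
numObservations = 200;

% Define the model as a dlnetwork object (no output layer needed).
layers = [
    featureInputLayer(numFeatures)
    fullyConnectedLayer(50)
    reluLayer
    fullyConnectedLayer(numClasses)
    softmaxLayer];
net = dlnetwork(layers);

% Random feature data formatted "CB" (channel, batch) and one-hot targets.
X = dlarray(rand(numFeatures,numObservations),"CB");
T = onehotencode(categorical(randi(numClasses,1,numObservations)),1);

numEpochs = 30;
learnRate = 0.01;
averageGrad = [];
averageSqGrad = [];

for epoch = 1:numEpochs
    % Evaluate the loss and gradients inside dlfeval so that dlgradient
    % can trace the computation using automatic differentiation.
    [loss,gradients] = dlfeval(@modelLoss,net,X,T);

    % Update the learnable parameters with Adam.
    [net,averageGrad,averageSqGrad] = adamupdate(net,gradients, ...
        averageGrad,averageSqGrad,epoch,learnRate);
end

function [loss,gradients] = modelLoss(net,X,T)
    % Forward pass in training mode, cross-entropy loss, and gradients
    % with respect to the learnable parameters.
    Y = forward(net,X);
    loss = crossentropy(Y,T);
    gradients = dlgradient(loss,net.Learnables);
end

After training, predict computes inference-mode outputs, and onehotdecode converts probability vectors back into class labels.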

Functions

dlnetwork - Deep learning network for custom training loops
resetState - Reset state parameters of neural network
plot - Plot neural network architecture
addInputLayer - Add input layer to network
addLayers - Add layers to layer graph or network
removeLayers - Remove layers from layer graph or network
connectLayers - Connect layers in layer graph or network
disconnectLayers - Disconnect layers in layer graph or network
replaceLayer - Replace layer in layer graph or network
summary - Print network summary
initialize - Initialize learnable and state parameters of a dlnetwork
networkDataLayout - Deep learning network data layout for learnable parameter initialization
layerGraph - Graph of network layers for deep learning
setL2Factor - Set L2 regularization factor of layer learnable parameter
getL2Factor - Get L2 regularization factor of layer learnable parameter
setLearnRateFactor - Set learn rate factor of layer learnable parameter
getLearnRateFactor - Get learn rate factor of layer learnable parameter
forward - Compute deep learning network output for training
predict - Compute deep learning network output for inference
adamupdate - Update parameters using adaptive moment estimation (Adam)
rmspropupdate - Update parameters using root mean squared propagation (RMSProp)
sgdmupdate - Update parameters using stochastic gradient descent with momentum (SGDM)
lbfgsupdate - Update parameters using limited-memory BFGS (L-BFGS)
lbfgsState - State of limited-memory BFGS (L-BFGS) solver
dlupdate - Update parameters using custom function
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops
updateInfo - Update information values for custom training loops
recordMetrics - Record metric values for custom training loops
groupSubPlot - Group metrics in training plot
padsequences - Pad or truncate sequence data to same length
minibatchqueue - Create mini-batches for deep learning (see the sketch after this list)
onehotencode - Encode data labels into one-hot vectors
onehotdecode - Decode probability vectors into class labels
next - Obtain next mini-batch of data from minibatchqueue
reset - Reset minibatchqueue to start of data
shuffle - Shuffle data in minibatchqueue
hasdata - Determine if minibatchqueue can return mini-batch
partition - Partition minibatchqueue
dlarray - Deep learning array for customization
dlgradient - Compute gradients for custom training loops using automatic differentiation
dlfeval - Evaluate deep learning model for custom training loops
dims - Dimension labels of dlarray
finddim - Find dimensions with specified label
stripdims - Remove dlarray data format
extractdata - Extract data from dlarray
isdlarray - Check if object is dlarray
crossentropy - Cross-entropy loss for classification tasks
l1loss - L1 loss for regression tasks
l2loss - L2 loss for regression tasks
huber - Huber loss for regression tasks
mse - Half mean squared error
ctc - Connectionist temporal classification (CTC) loss for unaligned sequence classification
dlaccelerate - Accelerate deep learning function for custom training loops
AcceleratedFunction - Accelerated deep learning function
clearCache - Clear accelerated deep learning function trace cache
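
The data-handling functions above typically work together. The following hedged sketch shows one way to combine minibatchqueue, shuffle, hasdata, next, and onehotencode to feed mini-batches into a loop like the one sketched earlier; the datastore construction, the preprocessMiniBatch helper, and all sizes are assumptions for illustration.

% Illustrative mini-batch pipeline for a custom training loop.
numFeatures = 10;
numClasses = 3;
numObservations = 200;

XTrain = rand(numObservations,numFeatures);                % one observation per row
TTrain = categorical(randi(numClasses,numObservations,1));

% Combine predictors and targets into a single datastore.
ds = combine(arrayDatastore(XTrain),arrayDatastore(TTrain));

% Return formatted dlarray mini-batches: "BC" marks the batch and channel
% dimensions of the predictors; the targets are left unformatted.
mbq = minibatchqueue(ds, ...
    MiniBatchSize=64, ...
    MiniBatchFcn=@preprocessMiniBatch, ...
    MiniBatchFormat=["BC",""]);

numEpochs = 5;
for epoch = 1:numEpochs
    shuffle(mbq);                    % reshuffle the data each epoch
    while hasdata(mbq)
        [X,T] = next(mbq);           % obtain the next mini-batch
        % ... evaluate gradients with dlfeval and update parameters here
    end
end

function [X,T] = preprocessMiniBatch(XCell,TCell)
    % Concatenate observations and one-hot encode the class labels.
    X = cat(1,XCell{:});                     % miniBatchSize-by-numFeatures
    T = onehotencode(cat(1,TCell{:}),2)';    % transpose to numClasses-by-miniBatchSize
end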

Topics

Custom Training Loops

Automatic Differentiation

Deep Learning Function Acceleration

Related Information