If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss functions that you need, then you can define a custom training loop. For networks that cannot be created using layer graphs, you can define a custom network as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
| Function | Description |
| --- | --- |
| `dlnetwork` | Deep learning network for custom training loops |
| `forward` | Compute deep learning network output for training |
| `predict` | Compute deep learning network output for inference |
| `adamupdate` | Update parameters using adaptive moment estimation (Adam) |
| `rmspropupdate` | Update parameters using root mean squared propagation (RMSProp) |
| `sgdmupdate` | Update parameters using stochastic gradient descent with momentum (SGDM) |
| `dlupdate` | Update parameters using custom function |
| `minibatchqueue` | Create mini-batches for deep learning |
| `onehotencode` | Encode data labels into one-hot vectors |
| `onehotdecode` | Decode probability vectors into class labels |
| `dlarray` | Deep learning array for custom training loops |
| `dlgradient` | Compute gradients for custom training loops using automatic differentiation |
| `dlfeval` | Evaluate deep learning model for custom training loops |
| `dims` | Dimension labels of `dlarray` |
| `finddim` | Find dimensions with specified label |
| `extractdata` | Extract data from `dlarray` |
| `isdlarray` | Determine whether input is `dlarray` |
| `functionToLayerGraph` | Convert deep learning model function to a layer graph |
| `dlconv` | Deep learning convolution |
| `dltranspconv` | Deep learning transposed convolution |
| `lstm` | Long short-term memory |
| `gru` | Gated recurrent unit |
| `embed` | Embed discrete data |
| `fullyconnect` | Sum all weighted input data and apply a bias |
| `relu` | Apply rectified linear unit activation |
| `leakyrelu` | Apply leaky rectified linear unit activation |
| `batchnorm` | Normalize each channel of mini-batch |
| `crosschannelnorm` | Cross channel square-normalize using local responses |
| `groupnorm` | Normalize activations across groups of channels |
| `avgpool` | Pool data to average values over spatial dimensions |
| `maxpool` | Pool data to maximum value |
| `maxunpool` | Unpool the output of a maximum pooling operation |
| `softmax` | Apply softmax activation to channel dimension |
| `crossentropy` | Cross-entropy loss for classification tasks |
| `sigmoid` | Apply sigmoid activation |
| `mse` | Half mean squared error |
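As a rough illustration of how the deep learning operations above compose into a network defined as a function, consider the sketch below. The `parameters` structure of `dlarray` learnables is an assumption and would need to be initialized separately.

```matlab
% Sketch of a model defined as a function using deep learning operations.
% parameters is an assumed structure of dlarray learnables (conv and fc fields).
function Y = model(parameters,X)
    % Convolution with "same" padding, followed by a ReLU activation.
    Y = dlconv(X,parameters.conv.Weights,parameters.conv.Bias,Padding="same");
    Y = relu(Y);
    % Weighted sum with bias, then softmax over the channel dimension.
    Y = fullyconnect(Y,parameters.fc.Weights,parameters.fc.Bias);
    Y = softmax(Y);
end
```

A model function like this can be evaluated and differentiated through dlfeval in the same way as a dlnetwork object.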
Learn how to train deep learning models in MATLAB®.
Learn how to define and customize deep learning training loops, loss functions, and networks using automatic differentiation.
This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.
Learn how to specify common training options in a custom training loop.
Learn how to define a model gradients function for a custom training loop.
This example shows how to update the network state in a custom training loop.
This example shows how to make predictions using a dlnetwork object by splitting data into mini-batches.
This example shows how to train a network that classifies handwritten digits using both image and feature input data.
This example shows how to train a deep learning network with multiple outputs that predict both labels and angles of rotations of handwritten digits.
This example shows how to create and train a deep learning network by using functions rather than a layer graph or a dlnetwork object.
This example shows how to update the network state in a network defined as a function.
This example shows how to make predictions using a model function by splitting data into mini-batches.
Learn how to initialize learnable parameters for custom training loops using a model function.
View the list of functions that support dlarray objects.
Learn how automatic differentiation works.
Learn how to use automatic differentiation in deep learning.
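As a small illustration of automatic differentiation with dlarray objects, the sketch below computes a value and its derivative inside dlfeval; the function name and values are arbitrary.

```matlab
% Evaluate y = x^2 + 3x and dy/dx at x = 2 using automatic differentiation.
x = dlarray(2);
[y,dydx] = dlfeval(@fun,x);   % dlgradient must be called inside dlfeval

function [y,dydx] = fun(x)
    y = x.^2 + 3*x;
    dydx = dlgradient(y,x);   % dy/dx = 2x + 3, which is 7 at x = 2
end
```

Calling dlgradient outside of a function evaluated by dlfeval produces an error, because the operations on the dlarray input must be traced.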