# relu

Apply rectified linear unit activation

## Description

The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.

This operation is equivalent to

$$f(x)=\begin{cases} x, & x>0 \\ 0, & x\le 0. \end{cases}$$

> **Note**
>
> This function applies the ReLU operation to `dlarray` data. To apply the ReLU activation within a `layerGraph` object or `Layer` array, use `reluLayer`.


`Y = relu(X)` computes the ReLU activation of the input `X` by applying a threshold operation: all values in `X` that are less than zero are set to zero.
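As a sketch of the definition above (not the library implementation), the threshold is equivalent to an elementwise maximum with zero:

```matlab
% Illustrative sketch: ReLU as an elementwise maximum with zero.
X = dlarray([-2 -0.5 0 1.5 3]);
Y = relu(X);
Yref = max(X,0);    % same elementwise result: [0 0 0 1.5 3]
isequal(extractdata(Y),extractdata(Yref))
```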

## Examples


Create a formatted `dlarray` object containing a batch of 128 28-by-28 images with 3 channels. Specify the format `"SSCB"` (spatial, spatial, channel, batch).

```matlab
miniBatchSize = 128;
inputSize = [28 28];
numChannels = 3;
X = rand(inputSize(1),inputSize(2),numChannels,miniBatchSize);
X = dlarray(X,"SSCB");
```

View the size and format of the input data.

```matlab
size(X)
```

```
ans = 1×4

    28    28     3   128
```

```matlab
dims(X)
```

```
ans =

    'SSCB'
```

Apply the ReLU operation using the `relu` function.

```matlab
Y = relu(X);
```

View the size and format of the output.

```matlab
size(Y)
```

```
ans = 1×4

    28    28     3   128
```

```matlab
dims(Y)
```

```
ans =

    'SSCB'
```

## Input Arguments

`X` — Input data, specified as a formatted or unformatted `dlarray` object.

## Output Arguments

`Y` — ReLU activations, returned as a `dlarray` object. The output `Y` has the same underlying data type as the input `X`.

If the input data `X` is a formatted `dlarray`, `Y` has the same dimension format as `X`. If the input data is not a formatted `dlarray`, `Y` is an unformatted `dlarray` with the same dimension order as the input data.
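A minimal sketch of the unformatted case — the format string is simply omitted when constructing the `dlarray`, and the output carries no format either:

```matlab
% Unformatted input: the output is an unformatted dlarray with the
% same dimension order as the input.
X = dlarray(rand(28,28,3,128));   % no format specified
Y = relu(X);
dims(Y)                            % empty — Y carries no dimension format
```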

## Version History

Introduced in R2019b