# relu

Apply rectified linear unit activation

## Syntax

``dlY = relu(dlX)``

## Description

The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.

This operation is equivalent to

$$f(x)=\begin{cases}x, & x>0\\ 0, & x\le 0.\end{cases}$$
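As a cross-language illustration (NumPy here rather than MATLAB, so the sketch can run anywhere), the piecewise definition above reduces to an elementwise maximum with zero:

```python
import numpy as np

def relu(x):
    """Elementwise ReLU: any value below zero is set to zero."""
    return np.maximum(x, 0)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
y = relu(x)  # negative entries become 0; nonnegative entries pass through
```

This is only a sketch of the mathematical operation; the toolbox `relu` additionally preserves `dlarray` formatting and supports automatic differentiation.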

Note

This function applies the ReLU operation to `dlarray` data. If you want to apply ReLU activation within a `layerGraph` object or `Layer` array, use a `reluLayer`.

`dlY = relu(dlX)` computes the ReLU activation of the input `dlX` by applying a threshold operation. All values in `dlX` that are less than zero are set to zero.

## Examples


Use the `relu` function to set negative values in the input data to zero.

Create the input data as a single observation of random values with a height and width of 12 and 32 channels.

```
height = 12;
width = 12;
channels = 32;
observations = 1;

X = randn(height,width,channels,observations);
dlX = dlarray(X,'SSCB');
```

Compute the ReLU activation.

`dlY = relu(dlX);`

All negative values in `dlX` are now set to `0`.
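To sanity-check the same behavior outside MATLAB, here is a NumPy sketch of the example above: a random 12×12×32×1 array (mirroring the `'SSCB'`-shaped data, though NumPy has no dimension labels) thresholded at zero:

```python
import numpy as np

rng = np.random.default_rng(0)
# Single observation: height 12, width 12, 32 channels, 1 observation.
x = rng.standard_normal((12, 12, 32, 1))
y = np.maximum(x, 0)  # ReLU: negative entries set to zero

assert (y >= 0).all()                 # no negative values remain
assert (y[x > 0] == x[x > 0]).all()   # positive values pass through unchanged
```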

## Input Arguments


`dlX` — Input data, specified as a `dlarray` with or without dimension labels.

Data Types: `single` | `double`

## Output Arguments


`dlY` — ReLU activations, returned as a `dlarray`. The output `dlY` has the same underlying data type as the input `dlX`.

If the input data `dlX` is a formatted `dlarray`, `dlY` has the same dimension labels as `dlX`. If the input data is not a formatted `dlarray`, `dlY` is an unformatted `dlarray` with the same dimension order as the input data.

Introduced in R2019b