preluLayer
Description
A PReLU layer performs a threshold operation, where for each channel, any input value less than zero is multiplied by a scalar learned at training time.
This operation is equivalent to:

f(x) = \begin{cases} x, & x \ge 0 \\ a\,x, & x < 0 \end{cases}

where a is the scale factor learned during training, one per channel.
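A minimal numeric sketch of this computation in base MATLAB (the value of a here is illustrative only; in the layer, the scale is a learnable parameter, one per channel):

x = [-2 -1 0 1 2];
a = 0.25;                    % illustrative scale; the layer learns this value
f = max(0,x) + a*min(0,x);   % PReLU: x where x >= 0, a*x where x < 0
                             % f is [-0.5 -0.25 0 1 2]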
Creation
Properties
Examples
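A minimal sketch of using the layer in a layer array, assuming the R2024a syntax in which preluLayer takes no required input arguments; the network shape and the commented training call are illustrative, and XTrain and TTrain are placeholder training data and targets:

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,Padding="same")
    preluLayer
    fullyConnectedLayer(10)
    softmaxLayer];

% Train with trainnet, for example:
% net = trainnet(XTrain,TTrain,layers,"crossentropy",trainingOptions("adam"));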
Algorithms
References
[1] Maas, Andrew L., Awni Y. Hannun, and Andrew Y. Ng. "Rectifier nonlinearities improve neural network acoustic models." In Proc. ICML, vol. 30, no. 1. 2013.
Extended Capabilities
Version History
Introduced in R2024a
See Also
trainnet | trainingOptions | leakyReluLayer | dlnetwork | reluLayer