Feedforward net - how to use LeakyReLU or scaled exponential linear unit for the hidden layers?

Ramakrishnan Raman on 8 Oct 2018
In a multi-layer shallow network built with feedforwardnet, how can I use different activation functions, such as Leaky ReLU or the scaled exponential linear unit (SELU), in the hidden layers? The only default supported for the hidden layers seems to be tansig.
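For context, feedforwardnet exposes the activation as a per-layer property (net.layers{i}.transferFcn), so built-in alternatives to tansig can be swapped in. A minimal sketch, assuming the Deep Learning Toolbox; the leakyrelu.m file mentioned at the end is a hypothetical custom function, not a shipped one:

```matlab
% Sketch: swapping hidden-layer transfer functions in a shallow network.
net = feedforwardnet([10 10]);        % two hidden layers of 10 neurons each

% Built-in alternatives include 'poslin' (plain ReLU), 'logsig', 'purelin'.
net.layers{1}.transferFcn = 'poslin';
net.layers{2}.transferFcn = 'poslin';

% Leaky ReLU and SELU are not shipped as transfer functions for shallow
% networks. One route is to write a custom transfer function file
% (e.g. a hypothetical leakyrelu.m modeled on the template of tansig.m,
% including its derivative/apply subfunctions), place it on the path, then:
% net.layers{1}.transferFcn = 'leakyrelu';
```

Whether a custom file fully integrates with training depends on implementing the same subfunction interface that the built-in transfer functions follow, so check the tansig source as a template.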

Answers (0)
