Use ReLU function for lstmLayer
I would like to change the StateActivationFunction of lstmLayer to the ReLU function, but only 'tanh' and 'softsign' are supported in the Deep Learning Toolbox.
Is there any solution for changing the activation function, or a way to build a custom lstmLayer with ReLU as the StateActivationFunction?
0 comments
Answers (1)
slevin Lee
on 21 Oct 2022
According to the documentation, GateActivationFunction — the activation function to apply to the gates — supports only:
- 'sigmoid' – the sigmoid function, σ(x) = 1 / (1 + e^(−x))
- 'hard-sigmoid' – the hard sigmoid function
and StateActivationFunction supports only 'tanh' and 'softsign'. There is no ReLU option.
╮(╯▽╰)╭
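One possible workaround (a sketch, not an official lstmLayer option): skip lstmLayer and implement the LSTM update yourself in a dlnetwork-style model function using dlarray operations, replacing tanh with relu. The parameter struct and field names below (Wi, Ri, bi, …) are illustrative; you would create and learn these parameters yourself, e.g. with dlfeval/dlgradient in a custom training loop.

```matlab
% Sketch: one LSTM time step with ReLU as the state activation.
% Assumes Deep Learning Toolbox, where sigmoid() and relu() are
% supported dlarray operations. p holds hypothetical learnable
% parameters: input weights W*, recurrent weights R*, biases b*.
function [h, c] = reluLstmStep(x, h, c, p)
    i = sigmoid(p.Wi*x + p.Ri*h + p.bi);  % input gate   (gate activation)
    f = sigmoid(p.Wf*x + p.Rf*h + p.bf);  % forget gate  (gate activation)
    g = relu(p.Wg*x + p.Rg*h + p.bg);     % candidate: tanh replaced by relu
    o = sigmoid(p.Wo*x + p.Ro*h + p.bo);  % output gate  (gate activation)
    c = f .* c + i .* g;                  % cell state update
    h = o .* relu(c);                     % state activation: tanh -> relu
end
```

You would call this in a loop over the time steps of the sequence. Note that ReLU is unbounded, so the cell state can grow without limit across time steps; that instability is likely one reason the toolbox restricts StateActivationFunction to the bounded 'tanh' and 'softsign'.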
0 comments