
Use ReLU function for lstmLayer

1 view (last 30 days)
Tomohiro Oka
Tomohiro Oka on 26 Jul 2019
Answered: slevin Lee on 21 Oct 2022
I would like to change the StateActivationFunction of lstmLayer to the ReLU function, but only 'tanh' and 'softsign' are supported in the Deep Learning Toolbox.
Is there any solution for changing the activation function, or a way to make a custom lstmLayer with ReLU as the StateActivationFunction?

Answers (1)

slevin Lee
slevin Lee on 21 Oct 2022
GateActivationFunction — Activation function to apply to the gates
  • 'sigmoid' – Use the sigmoid function σ(x) = 1/(1 + e^(-x)).
  • 'hard-sigmoid' – Use the hard sigmoid function
no ReLU function
╮(╯▽╰)╭
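Since lstmLayer itself only offers 'tanh' and 'softsign' for the state activation, one workaround is to write the LSTM update out by hand inside a custom training loop (or a custom layer), replacing tanh with ReLU. The sketch below is illustrative, not a Deep Learning Toolbox API: the function name reluLstmStep and the weight layout W, R, b are assumptions, and it uses only basic MATLAB operations so it can be adapted to dlarray code.

```matlab
% Hedged sketch: one LSTM time step with ReLU as the state activation.
% W: [4n x d] input weights, R: [4n x n] recurrent weights, b: [4n x 1] bias,
% stacked in the gate order [input; forget; cell-candidate; output].
function [h, c] = reluLstmStep(x, h, c, W, R, b)
    n  = numel(h);
    z  = W*x + R*h + b;            % all gate pre-activations at once
    sg = @(v) 1./(1 + exp(-v));    % logistic sigmoid (gate activation)
    i  = sg(z(1:n));               % input gate
    f  = sg(z(n+1:2*n));           % forget gate
    g  = max(z(2*n+1:3*n), 0);     % ReLU replaces tanh as the state activation
    o  = sg(z(3*n+1:4*n));         % output gate
    c  = f.*c + i.*g;              % new cell state
    h  = o.*max(c, 0);             % ReLU again where tanh(c) would normally be
end
```

Note that, unlike tanh, ReLU is unbounded, so the cell state can grow without limit; gradient clipping or a smaller learning rate may be needed in training.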
