How to use sigmoid layer?

29 views (last 30 days)
REN Jain
REN Jain on 2 Nov 2020
Commented: Z on 23 Sep 2022
Hello
I am creating a neural network for binary classification.
While executing trainNetwork, I get an error saying that a classificationLayer must be preceded by a softmax layer.
What changes should I make? I can use a softmaxLayer after fullyConnectedLayer(2), but using a sigmoidLayer for binary classification would be much more efficient. So which layer am I supposed to use after the sigmoidLayer?
layers = [ ...
    sequenceInputLayer(inputSize)
    lstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(1)
    sigmoidLayer
    classificationLayer];
  2 Comments
Ankit Pasi
Ankit Pasi on 15 May 2021
I have the exact same situation and question. Sadly, the deep learning community within MATLAB is small to nonexistent, and this question, like other similar ones, goes unanswered...
Z
Z on 23 Sep 2022
@REN Jain @Ankit Pasi Were you able to solve this issue?


Answers (1)

Abolfazl Chaman Motlagh
Abolfazl Chaman Motlagh on 8 Dec 2021
The classificationLayer assigns the class with the highest probability using a cross-entropy loss, so the output of the layer preceding classificationLayer must be a discrete probability distribution. The output of a fullyConnectedLayer, even with a nonlinear activation such as sigmoid, does not form a probability distribution; the softmax function provides exactly such an output.
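As a quick numerical illustration (the two scores here are made-up values), sigmoid squashes each score independently, so the outputs need not sum to 1, whereas softmax normalizes them into a distribution:
z   = [2.0; -1.0];            % two made-up class scores
sig = 1 ./ (1 + exp(-z));     % sigmoid: [0.88; 0.27], sums to 1.15 -> not a distribution
sm  = exp(z) ./ sum(exp(z));  % softmax: [0.95; 0.05], sums to 1   -> valid distribution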
You can also create your own type of activation or classification function. In the standard layer-array workflow in MATLAB, however, classificationLayer must be preceded by a softmaxLayer, as the error says; if you want your own classification or activation function that yields a probability distribution, you have to write a custom output layer, for example as sketched below.
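As a rough sketch only, a custom output layer could compute a binary cross-entropy loss directly on a single sigmoid output, based on the nnet.layer.ClassificationLayer template. The class name, the 0/1 target encoding, the assumed 1-by-N shapes and the epsilon value are all assumptions here, not a tested drop-in solution:
classdef binaryCrossEntropyLayer < nnet.layer.ClassificationLayer
    % Hypothetical custom output layer: binary cross-entropy on one sigmoid unit.
    methods
        function layer = binaryCrossEntropyLayer(name)
            % Set layer name and description.
            layer.Name = name;
            layer.Description = 'Binary cross-entropy';
        end
        function loss = forwardLoss(layer, Y, T)
            % Y: predicted probabilities in (0,1), T: targets coded as 0/1,
            % both assumed 1-by-N for a sequence-to-label network (assumption).
            N    = size(Y,2);
            eps0 = 1e-8;   % avoid log(0)
            loss = -sum(T.*log(Y+eps0) + (1-T).*log(1-Y+eps0)) / N;
        end
    end
end
Whether trainNetwork accepts a one-unit output also depends on how your responses are encoded, so the two-class softmax route below is the simpler, well-supported option.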
Your set of layers also has another issue: for a classification task, the last layer before classification must have as many outputs as there are classes. You want binary classification, i.e. 2 classes, so the last fullyConnectedLayer should have 2 outputs, and for the reason above it must be followed by a softmaxLayer before classificationLayer.
A simple correction of your layers would be:
layers = [ ...
    sequenceInputLayer(inputSize)
    lstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(n1) % e.g. n1 = 10
    sigmoidLayer
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];
or
layers = [ ...
    sequenceInputLayer(inputSize)
    lstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(2)
    sigmoidLayer
    softmaxLayer
    classificationLayer];
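For completeness, a minimal training sketch with the corrected layers, assuming XTrain is a cell array of sequences (inputSize-by-sequenceLength each), YTrain is a two-class categorical vector, and XTest holds new sequences; the variable names and option values are placeholders:
% Training options; values are illustrative only.
options = trainingOptions('adam', ...
    'MaxEpochs',30, ...
    'MiniBatchSize',32, ...
    'Shuffle','every-epoch', ...
    'Verbose',false);
net   = trainNetwork(XTrain, YTrain, layers, options);  % layers from above
YPred = classify(net, XTest);                           % predicted class labels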
