How can I constrain neural network weights?

Luke Wilhelm on 7 Dec 2012
Answered: Sara Perez on 12 Sep 2019
I am using the Neural Network Toolbox to create a feedforward network. The input is a 4x1 vector, followed by a 4-neuron hidden layer, a 6-neuron hidden layer, and a 4-neuron output layer. I would like to constrain the final 4x6 matrix of layer weights so that the weight values cannot be negative. I realize that this will probably hurt the network's accuracy, but for the purpose of my research I would like to see what the results are.
Is it possible to constrain the layer weights in this way? I have found how to set layer weights to a specified value and prevent their learning using net.layerWeights{i,j}.learn=false;, but not how to allow the weights to change while preventing them from becoming negative.
Thanks, Luke
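
A minimal sketch of this setup, using standard Neural Network Toolbox calls (feedforwardnet, configure, train) and placeholder data: the 4x6 matrix in question is net.LW{3,2}, the weights from the 6-neuron hidden layer into the 4-neuron output layer, and the learn flag mentioned above freezes that matrix at a chosen value rather than constraining its sign.

% Placeholder 4xN inputs and 4xN targets; replace with real data
x = rand(4, 100);
t = rand(4, 100);

net = feedforwardnet([4 6]);   % hidden layers of 4 and 6 neurons
net = configure(net, x, t);    % sets up the 4 inputs and 4 outputs

size(net.LW{3,2})              % -> [4 6], second hidden layer -> output

% What the question already found: fix the matrix and stop it learning
net.LW{3,2} = abs(net.LW{3,2});        % e.g. start it non-negative
net.layerWeights{3,2}.learn = false;   % keep it fixed during training
net = train(net, x, t);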
  1 Comment
Greg Heath on 9 Dec 2012
One hidden layer is sufficient for a universal approximator.
If the hidden-node activation functions are all odd, flipping the sign of every weight connected to a given hidden node will not change the output.
Therefore, if there is only one output node, the task is easy.
Otherwise, it will not work in general.
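
To see Greg's point concretely, here is a small check (a sketch with placeholder data) for a network with a tansig hidden layer and a single linear output: flipping the sign of every weight and bias attached to one hidden neuron leaves the output unchanged, because tansig(-z) = -tansig(z).

x = rand(4, 50);
t = rand(1, 50);
net = feedforwardnet(4);       % one tansig hidden layer, linear output
net = configure(net, x, t);
y1 = net(x);

k = 2;                                  % pick any hidden neuron
net.IW{1,1}(k,:) = -net.IW{1,1}(k,:);   % flip its input weights
net.b{1}(k)      = -net.b{1}(k);        % flip its bias
net.LW{2,1}(:,k) = -net.LW{2,1}(:,k);   % flip its outgoing weight

y2 = net(x);
max(abs(y1 - y2))                       % effectively zero

With a single output node this symmetry lets you flip signs so that every outgoing hidden-to-output weight is non-negative without changing the network's function, which is why Greg calls that case easy.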


Answers (2)

R L on 24 Jul 2015
I would like to ask how you set a subset of the layer weights to a specified value while preventing their learning using net.layerWeights{i,j}.learn=false.
Did you ever solve the problem of constraining the weights to have a specified sign while still allowing them to learn? Thanks.
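
One heuristic that addresses the sign question (a sketch, not a built-in constrained-training option of the toolbox): train in short bursts and project the target weight matrix back onto the non-negative orthant after each burst, so the weights keep learning but never stay negative.

x = rand(4, 200);              % placeholder data
t = rand(4, 200);

net = feedforwardnet([4 6]);
net = configure(net, x, t);
net.trainParam.epochs = 5;     % a few epochs per burst
net.trainParam.showWindow = false;

for k = 1:40                   % roughly 200 epochs in total
    net = train(net, x, t);
    net.LW{3,2} = max(net.LW{3,2}, 0);   % clip negative weights to zero
end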

Sara Perez on 12 Sep 2019
You can set the layer property 'WeightLearnRateFactor' to zero, so the weights won't be modified during training.
more info here:
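
A minimal sketch of the property Sara mentions, using the layer-based Deep Learning Toolbox API (fullyConnectedLayer is a standard toolbox function). Note that a learn-rate factor of zero freezes the weights at their initial values; it does not constrain their sign.

fc = fullyConnectedLayer(4);
fc.WeightLearnRateFactor = 0;   % weights keep their initial values
fc.BiasLearnRateFactor   = 0;   % optionally freeze the biases as well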
