
Why is the size of the input weight matrix sometimes smaller than the input length when training a neural network?

8 views (last 30 days)
I have a question regarding the size of the input weight matrix for a neural network. My IW matrix is smaller than expected and I don't know why. Here is what I do:
net=patternnet(1);
[net,tr]=train(net,inputs,targets);
net.IW %size of the input weight matrices
ans =
[1x14 double]
[]
net.inputs.size %size of my inputs
ans =
[15]
net.layers.size %size of my hidden and output layer
ans =
[1]
[2]
As far as I understand, the size of my input weight matrix should be 1 (size of the hidden layer) by 15 (length of the input vectors). I tried it several times with different input sizes; sometimes the size of IW equals my input size, and sometimes it is one or two smaller.
I want to know why this happens and how I can match the weights to the input variables. Thanks in advance, Antje

Accepted Answer

Antje on 6 Sep 2012
Oh, finally I got it!
The problem was that there were some redundant (constant) columns in the inputs. I had not seen them because I have a huge amount of data.
It seems that the training process ignores these columns, but I could not see which of them.
If I get rid of them before training, then the input size and the size of the weight matrix match.
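A sketch of that cleanup step, assuming `inputs` is laid out the way the toolbox expects (one feature per row, one sample per column); `constIdx` is a name introduced here for illustration:

```matlab
% Features with zero variance never change across samples, so they carry
% no information for training. Find and drop them before calling train.
constIdx = max(inputs,[],2) == min(inputs,[],2);  % logical mask of constant rows
inputs(constIdx,:) = [];                          % remove those rows
```

After this, `size(net.IW{1,1},2)` should match `size(inputs,1)` once the network is retrained.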
5 comments
enjy fikry on 5 May 2017
How can I stop that from happening? I don't want the training process to ignore these constant columns.
Greg Heath on 5 May 2017
You should.
They have zero variance.
Therefore they cannot contribute to learning.
However, they can confuse those who do not understand this.
Hope this helps.
Greg
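For context, the removal comes from the network's default input processing functions, which include `removeconstantrows`. A sketch of how to inspect that setting and, against Greg's advice above, disable it (property names as in the Neural Network Toolbox):

```matlab
net = patternnet(1);
net.inputs{1}.processFcns   % default includes 'removeconstantrows' and 'mapminmax'
% To keep constant rows anyway (usually unwise, for the reasons Greg gives):
net.inputs{1}.processFcns = {'mapminmax'};
```

With `removeconstantrows` gone, the input weight matrix keeps one column per input feature, constant or not.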


More Answers (0)

Categories

Find more on Sequence and Numeric Feature Data Workflows in Help Center and File Exchange
