How to display the weight distribution in the hidden layers of a neural network?
I have 8 inputs in the input layer. I want to display the weight distribution of these 8 inputs in the hidden layer in order to observe the importance of the features. To make it clearer, an example is shown in this figure ( https://pasteboard.co/GKCpA6Q.png ). I used MATLAB's `plotwb` function, but it did not display the weights of every input.
Specifically, I want to look at the weights connecting the inputs to the first hidden layer. The larger the weight, the more important the input.
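For reference, this is the kind of plot I am after. A minimal sketch (assuming the Deep Learning Toolbox; `x` and `t` below are synthetic placeholders for my own 8-input data set):

```matlab
% Minimal sketch: plot the input-to-hidden weights of a trained network.
% x and t are synthetic placeholders for illustration only.
x = rand(8, 200);                 % 8 features, 200 samples
t = sum(x([1 3], :), 1);          % synthetic target
net = feedforwardnet(10);         % one hidden layer with 10 neurons
net.trainParam.showWindow = false;
net = train(net, x, t);

IW = net.IW{1,1};                 % 10-by-8 input-to-hidden weight matrix
figure;
bar(abs(IW)');                    % one group of bars per input feature
xlabel('Input feature');
ylabel('|weight| to each hidden neuron');
title('Input-to-hidden weight magnitudes');
```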
0 Comments
Answers (1)
Greg Heath
on 17 Sep 2017
That will not work: ranking inputs by weight magnitude does not account for the correlations between the inputs.
The best way to rank correlated inputs is:
1. Use NO HIDDEN LAYERS!
2. Run 10 or more trials each (with different random initial weights) using
a. a single input
b. all inputs except the one in a.
(See the sketch of this procedure below.)
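A minimal sketch of that procedure, under the assumption that `feedforwardnet([])` (an empty hidden-layer specification) yields a linear model with no hidden layer, and with `x` (8-by-N inputs) and `t` (1-by-N targets) standing in for your own data:

```matlab
% Minimal sketch of the ranking procedure above.
% Assumptions: x is 8-by-N, t is 1-by-N, and feedforwardnet([]) gives a
% linear model with no hidden layer.
numInputs   = size(x, 1);
numTrials   = 10;
singleMSE   = zeros(numInputs, numTrials);   % error using input i alone
leaveOutMSE = zeros(numInputs, numTrials);   % error using all inputs but i

for i = 1:numInputs
    others = setdiff(1:numInputs, i);
    for k = 1:numTrials                      % new random init / data split each trial
        % (a) a single input
        netA = feedforwardnet([]);           % NO hidden layers -> linear model
        netA.trainParam.showWindow = false;
        netA = train(netA, x(i,:), t);
        singleMSE(i,k) = perform(netA, t, netA(x(i,:)));

        % (b) all inputs except the one in (a)
        netB = feedforwardnet([]);
        netB.trainParam.showWindow = false;
        netB = train(netB, x(others,:), t);
        leaveOutMSE(i,k) = perform(netB, t, netB(x(others,:)));
    end
end

% A low error in (a) and a large error increase in (b) both point to an
% important input.
disp([mean(singleMSE, 2), mean(leaveOutMSE, 2)]);
```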
Hope this helps.
Thank you for formally accepting my answer
Greg
0 Comments