Set a specific weight for a connection in neural networks
32 views (last 30 days)
I have already built my neural network. I want to set a specific weight for the connection from layer i to layer j, for example set the weights from the inputs to the outputs to 1.
How can I do this in MATLAB?
X=Calculations('comp2real' , PA_in); % convert nx1 complex vector to nx2 real vector
T=Calculations('comp2real' , PA_out);
%% Network structure
net = feedforwardnet(20);               % one hidden layer with 20 neurons
% net.numInputs=2
% net.layers{2}.size=2;
net.biasConnect=[1;0];                  % bias on the hidden layer only
% net.inputWeights{2,1}.weight=1;
net.inputWeights{2,1}.learn=0;          % keep the weights from the input to layer 2 fixed
net.layers{1}.transferFcn = 'poslin';   % ReLU transfer function in the hidden layer
% net.inputConnect=[1 1;1 1]
net.inputConnect=[1;1];                 % connect the input to both layers
[net, tr] = train(net,X,T);
plotperform(tr)
view(net)
wb=getwb(net);
Param_num=length(wb)
%% Evaluation
Y = net(X);
perf = perform(net,T,Y)
1 Comment
Adam Danz
on 9 Jan 2020
Have you tried to search for the answer to this question? Google returns some useful examples and starting points for solving this, and searching the MATLAB documentation directly is also helpful. Without more context on how you created the neural network, etc., we're shooting in the dark. I'd be interested in hearing what solutions you've found and where we can help implement them.
Answers (2)
Srivardhan Gadila
on 24 Jan 2020
If by "I have already built my neural network" you mean that
1. The network architecture is defined but has yet to be trained:
Then you can access the layer weights as follows:
net.LW{i,j}
You can assign any values to these weights and set net.layerWeights{i,j}.learn to 0 so that they are not altered during training and adaption. In this case, fixing a single connection is not possible, because net.layerWeights{i,j}.learn applies to all connections between layers i and j.
net.layerWeights{i,j}.learn = 0
net.LW{i,j} = ones(size(net.LW{i,j})) % any weights of size(net.LW{i,j})
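Note that right after feedforwardnet the weight matrices are still empty (the output layer size is only fixed once the network is configured for the data), so a minimal sketch of case 1, assuming example data X and T, could look like this:
net = feedforwardnet(20);                % one hidden layer with 20 neurons
net = configure(net, X, T);              % allocate the weight and bias sizes from the data
net.LW{2,1} = ones(size(net.LW{2,1}));   % fix all hidden-to-output weights to 1
net.layerWeights{2,1}.learn = 0;         % exclude them from training and adaption
[net, tr] = train(net, X, T);            % LW{2,1} is still all ones after training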
2. The network architecture is defined and trained already:
Then you can set the weight of the single connection from neuron l of layer j into neuron k of layer i as follows:
net.LW{i,j}(k,l) = 1
and then use the network.
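For instance, net.LW{i,j} has one row per neuron of the receiving layer i and one column per neuron of the sending layer j, so for a trained feedforwardnet with hidden layer 1 feeding output layer 2 a sketch (the neuron indices here are only assumptions) would be:
net.LW{2,1}(1,5) = 1;   % weight from hidden neuron 5 into output neuron 1
Y = net(X);             % use the modified network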
The same can be done for the input weights (net.IW{i,j} together with net.inputWeights{i,j}.learn), for example as sketched below.
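A minimal sketch for the input weights, assuming the default connection from input 1 into layer 1 and that the network has been configured (or trained) so that net.IW{1,1} has its final size:
net = configure(net, X, T);              % size the weights from the data, if not already done
net.IW{1,1} = ones(size(net.IW{1,1}));   % weights from input 1 into layer 1
net.inputWeights{1,1}.learn = 0;         % keep them fixed during training
[net, tr] = train(net, X, T);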
1 Comment
LukasJ
on 7 Sep 2020
I think my issue with updating the weights could be related.
Thanks in advance and best regards,
Lukas
Matthew Heberger
on 2 May 2022
Edited: Matthew Heberger
on 2 May 2022
I found that it was difficult to set fixed input weights for a custom feedforward network (in MATLAB R2022a). I wanted to set the weight from input 10 to layer 25 to -1 and to give layer 25 a bias of 0, and used the following code:
net.biases{25}.learn = false;
net.b{25} = 0;
net.inputWeights{25, 10}.learn = false;
net.IW{25, 10} = -1; % This line caused an error
This gave the following error message at runtime:
Error using network/subsasgn>network_subsasgn
net.IW{25,10} must be a 1-by-0 matrix.
A colleague and I discovered that if we initialize the network first, we can set the input weights:
net = configure(net, X, T); %configure the network first
net.biases{25}.learn = false;
net.b{25} = 0;
net.inputWeights{25, 10}.learn = false;
net.IW{25, 10} = -1; % Runs OK *after configuration*
It took us a long time to figure this out, and it feels like a MATLAB bug, so I'm posting it here in the hope it helps someone else.
0 Comments