For loop defining the network architecture
Jingyuan Yao
on 5 Jul 2021
Commented: Jingyuan Yao
on 15 Jul 2021
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created 26-Apr-2020 11:07:28
%
% This script assumes these variables are defined:
%
% Einleger_Binaer_Alle_inv_Sortiert - input data.
% Auslenkung_Alle_inv - target data.
x = Einleger_Binaer_Alle_inv_Sortiert;
t = Auslenkung_Alle_inv;
rng('default');
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation, chosen because it is the fastest option
% Create a Fitting Network
% hiddenLayerSize = 10;
% net_hiddenlayersize6_sortiert = fitnet(hiddenLayerSize,trainFcn);
% Layer construction for a network with more hidden layers
hiddenLayer1Size = 10;
hiddenLayer2Size = 10;
net = fitnet([hiddenLayer1Size hiddenLayer2Size], trainFcn);
Hi,
I now have a neural network with two hidden layers and want to extend it to more hidden layers.
Can I use a for loop for the layer construction? And if it is possible, how can I write it?
0 Comments
Accepted Answer
Bhavya Chopra
on 8 Jul 2021
I understand that you want to create a neural network with multiple hidden layers instantiated using a for loop. The fitnet function can be provided a vector of hidden layer sizes. Assuming the same size for each hidden layer:
trainFcn = 'trainlm';
n = 5; % Number of hidden layers
s = 10; % Size of each hidden layer
layerSizes = ones(1, n)*s; % Create the vector of hidden layer sizes
net = fitnet(layerSizes, trainFcn);
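As a quick sanity check (a minimal sketch, assuming the network object returned by fitnet above), the resulting architecture can be inspected directly:
view(net)           % open a diagram of the network showing the hidden layers and their sizes
disp(net.numLayers) % total number of layers (hidden layers plus the output layer)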
Alternatively, if the layers have varying sizes, for instance decreasing in powers of two:
trainFcn = 'trainlm';
n = 5; % Number of hidden layers
layerSizes = zeros(1,n);
for i = 1:n
    layerSizes(i) = 2^(n-i+1); % layer sizes 2^n, 2^(n-1), ..., 2
end
net = fitnet(layerSizes, trainFcn);
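Once the layer sizes are defined, training works exactly as with the original two-layer network. A minimal sketch, assuming the x and t variables defined in the script from the question:
[net, tr] = train(net, x, t); % train the network on the input/target data
y = net(x);                   % evaluate the trained network on the inputs
perf = perform(net, t, y)     % performance value (mean squared error by default for fitnet)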
More Answers (0)