- ‘fitnet’: a high-level function that provides a convenient way to create and train a function-fitting neural network with a single call.
- ‘network’: a low-level function for creating customized shallow neural networks. It provides more control over the network architecture, and training behaviour is set in detail through properties such as ‘net.trainFcn’ and ‘net.trainParam’, so more work is required to set up and train the network compared to ‘fitnet’.
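As a minimal sketch of the low-level route, the ‘network’ constructor can build a two-layer architecture directly, following the pattern in the custom-network documentation (one input, two layers, biases on both layers, the input feeding layer 1, layer 1 feeding layer 2, and layer 2 producing the output); the layer sizes and parameter values here are illustrative assumptions, not values from the question:

```matlab
% Low-level custom shallow network:
% network(numInputs, numLayers, biasConnect, inputConnect, layerConnect, outputConnect)
net = network(1, 2, [1; 1], [1; 0], [0 0; 1 0], [0 1]);
net.layers{1}.size = 6;               % neurons in the first layer (assumed value)
net.layers{1}.transferFcn = 'tansig'; % activation of layer 1
net.layers{2}.transferFcn = 'logsig'; % activation of layer 2
net.trainFcn = 'traingd';             % training function is set directly...
net.trainParam.lr = 0.05;             % ...along with its parameters
```

Everything ‘fitnet’ does in one call is spelled out here property by property, which is what gives the extra control.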
Shallow neural network parameters for training
I want to build a shallow network with two hidden layers. I am quite confused among fitnet, net, network, and the other commands.
I can reach a result; however, I cannot modify it.
For example, the learning rate and activation functions do not change.
And the final question: is it possible to add mini-batch learning to gradient descent in MATLAB?
trainFcn = 'traingd';
hiddenLayerSize = [6 36];   % two hidden layers
net = fitnet(hiddenLayerSize, trainFcn);   % create the network first, then modify it
net.trainParam.lr = 0.05;
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'logsig';
% net.layers{1}.transferFcn = 'poslin';   % ReLU for shallow networks ('reluLayer' is a deep learning layer)
Answers (1)
Sanjana
on 3 Mar 2023
Hi Mohammad,
In MATLAB, ‘fitnet’ and ‘network’ are both functions that can be used to create and train neural networks, but they differ in their flexibility and ease of use.
Please refer to the following documentation to understand how to edit shallow neural network properties:
https://www.mathworks.com/help/deeplearning/ug/create-and-train-custom-neural-network-architectures.html
As for the learning rate: for shallow networks it is a parameter of the training function, set through the network's ‘trainParam’ structure. (The ‘trainingOptions’ object belongs to the deep learning workflow and is not accepted by ‘train’.) Refer to the following example:
Example:
% Create a shallow neural network with one hidden layer
net = feedforwardnet(10);
% Choose a gradient-descent training function and set its parameters
net.trainFcn = 'traingd';
net.trainParam.lr = 0.01;      % learning rate
net.trainParam.epochs = 50;    % maximum number of epochs
% Train the neural network (inputs and targets are your data matrices)
net = train(net, inputs, targets);
And, it is not possible to use mini-batch learning with the 'traingd' training function. 'traingd' is a batch gradient descent algorithm that updates the network weights using the gradients computed over the entire training dataset. The same holds for the other shallow-network training functions such as 'traingdm' and 'trainscg', which also operate on the full batch. If you need true mini-batch stochastic gradient descent, the usual route is the deep learning workflow ('trainingOptions' with the 'sgdm' solver and 'MiniBatchSize'); alternatively, a shallow network can be updated incrementally, one sample at a time, with 'adapt'.
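Since 'traingd' only does full-batch updates, a hedged sketch of the mini-batch alternative via the deep learning workflow is shown below. It assumes X is an N-by-numFeatures predictor matrix and Y an N-by-1 response vector; numFeatures, the layer sizes, and the option values are placeholder assumptions, and the layers mirror the tansig/logsig choices from the question:

```matlab
% Mini-batch SGD on an equivalent two-hidden-layer regression network
layers = [
    featureInputLayer(numFeatures)    % numFeatures: placeholder for your input size
    fullyConnectedLayer(6)
    tanhLayer                         % analogous to 'tansig'
    fullyConnectedLayer(36)
    sigmoidLayer                      % analogous to 'logsig'
    fullyConnectedLayer(1)
    regressionLayer];
options = trainingOptions('sgdm', ...
    'MiniBatchSize', 32, ...          % mini-batch size (assumed value)
    'InitialLearnRate', 0.05, ...
    'MaxEpochs', 50);
net = trainNetwork(X, Y, layers, options);
```

Here each weight update uses only 32 samples, which is the mini-batch behaviour that the shallow 'train' function does not provide.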
Hope this helps!