
Backpropagation, multilayer perceptron, neural network

rajesh yakkundimath on 29 Dec 2011
Dear sir,
I am attaching MATLAB code in which I tried to train a network using feed-forward backpropagation. I am having difficulty with the instruction
net_FFBP = createNet(inputsize, mimax, hneurons, fcnCELL, initflag, trainalgo, paramatrix, sameWEIGHT);
Can someone tell me how to save the parameters in net_FFBP? I have attached the code below.
function TrainingNet
load Feature.txt; %load the features
FeatureS = Feature'; %Convert to column array
load Outtype.txt; %load output type
OuttypeS = Outtype';
inputsize = size(FeatureS, 1);
min_data = min(min(FeatureS));
max_data = max(max(FeatureS));
mimax = [min_data max_data];
hneurons = 2000;
%initialize parameters for creating the MLP.
fcnCELL = {'logsig' 'logsig'};
initflag = [0 1];
trainalgo = 'gdm';
paramatrix = [10000 50 0.9 0.6]; % epochs = 100, show = 50, learning rate = 0.9, momentum term = 0.6
sameWEIGHT = [];
net_FFBP = createNet(inputsize, mimax, hneurons, fcnCELL, initflag, trainalgo, paramatrix, sameWEIGHT);
net_FFBP = newff(FeatureS, OuttypeS, 39);
[net_FFBP] = train(net_FFBP, FeatureS, OuttypeS);
save net_FFBP net_FFBP;
disp('Done: Training Network');
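For reference, a minimal sketch of how the weights and training parameters of the trained network can be read out and stored, using standard Neural Network Toolbox fields (variable names follow the code above; getwb is assumed to be available in your toolbox version):
IW = net_FFBP.IW{1,1};        % input-to-hidden weight matrix
LW = net_FFBP.LW{2,1};        % hidden-to-output weight matrix
b  = net_FFBP.b;              % bias vectors, one cell per layer
wb = getwb(net_FFBP);         % all weights and biases as a single column vector
tp = net_FFBP.trainParam;     % training parameters (fields depend on the training function)
save('net_FFBP.mat', 'net_FFBP', 'IW', 'LW', 'wb');   % store in a MAT-file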

Accepted Answer

Greg Heath on 29 Dec 2011
% function TrainingNet
% load Feature.txt; %load the features
% FeatureS = Feature'; %Convert to column array
% load Outtype.txt; %load output type
% OuttypeS = Outtype';
[I N] = size(FeatureS)
[O N] = size(OuttypeS)
minmaxF = minmax(FeatureS) % an [I 2] matrix
Neq = N*O % Number of training equations
% I-H-O node topology
% Nw = (I+1)*H+(H+1)*O % Number of unknown weights
% Want Neq >> Nw, i.e., H << Hub
Hub = (Neq-O)/(I+O+1) % upper bound on H (the H at which Neq = Nw)
r = 10 % Neq > r*Nw, ~2 < r < ~30
H = floor((Neq/r-O)/(I+O+1))
How did you get H = 2000 ???
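As a rough numeric illustration of the sizing rule above (the dimensions I = 20, O = 1, N = 5000 are assumed here only for the arithmetic; substitute the sizes of your own data):
I = 20; O = 1; N = 5000;       % assumed example dimensions only
Neq = N*O                      % Neq = 5000 training equations
Hub = (Neq-O)/(I+O+1)          % Hub ~ 227, the H at which Neq = Nw
r = 10;
H = floor((Neq/r-O)/(I+O+1))   % H = 22, nowhere near 2000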
% %initialize parameters for creating the MLP.
% fcnCELL = {'logsig' 'logsig'};
% initflag = [0 1];
What does initflag do?
% trainalgo = 'gdm';
% paramatrix = [10000 50 0.9 0.6]; % epochs = 100, show = 50,
100 or 10,000?
% learning rate = 0.9, momentum term = 0.6
% sameWEIGHT = [];
I suggest first using the defaults in NEWFF (a minimal sketch follows at the end of this answer).
% net_FFBP = createNet(inputsize, mimax, hneurons, fcnCELL, initflag, trainalgo, paramatrix, sameWEIGHT);
Is this supposed to be a replacement for NEWFF and net.trainParam.* ??
% net_FFBP = newff(FeatureS, OuttypeS, 39);
Now H = 39 ??
% [net_FFBP] = train(net_FFBP, FeatureS, OuttypeS);
% save net_FFBP net_FFBP;
% disp('Done: Training Network');
What is your question ??
Greg
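Following the suggestion above to start from the NEWFF defaults, a minimal sketch (H = 10 is a placeholder; choose H << Hub as derived above, and leave the training function, transfer functions, and data division at their defaults):
H = 10;                                    % placeholder; pick H << Hub
net_FFBP = newff(FeatureS, OuttypeS, H);   % defaults: tansig/purelin, trainlm, dividerand
net_FFBP = train(net_FFBP, FeatureS, OuttypeS);
Outsim = sim(net_FFBP, FeatureS);          % check performance on the training set
save('net_FFBP.mat', 'net_FFBP');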

More Answers (0)
