
Neural network result errors seem to be random (not reproducible)

I'm wondering why my neural network ends up with a different root mean square error every time I run it (starting with an empty workspace). I divide the data set with "divideind", so there is no randomness in choosing the training, validation and testing sets. The inputSeries contains five input variables and the targetSeries contains one output variable. Each variable contains a time series of 8784 values. Is there some randomness in the training function?
The code is as follows:
inputSeries = pvInputs;
targetSeries = pvOutputs;
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:4; % Delay of 4 hours
feedbackDelays = 1:24; % Delay of 24 hours
hiddenLayerSize = 12;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
% Choose Input and Feedback Pre/Post-Processing Functions
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
% Setup Division of Data for Training, Validation, Testing
% (preparets drops the first 24 time steps into the initial states,
% so only 8760 samples remain for division)
net.divideFcn = 'divideind';
net.divideParam.trainInd = 1:5112;
net.divideParam.valInd = 5113:7320;
net.divideParam.testInd = 7321:8760;
% Choose a Training Function
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% Choose a Performance Function
net.performFcn = 'mse'; % Mean squared error
% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs);
% Correct Negative Values
outputs = num2cell(max(cell2mat(outputs),0));
% Calculate Root Mean Square Error (RMSE) over the test set,
% normalized by the peak network output
testTargets = cell2mat(targets(7321:8760))';
testOutputs = cell2mat(outputs(7321:8760))';
rmse = sqrt(mean(((testTargets - testOutputs)./max(cell2mat(outputs))).^2));
er_ARX = rmse*100; % in percent
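
As a quick check that the variation comes from weight initialization rather than data division (a diagnostic sketch, not part of the original script; netA and netB are arbitrary names, and it reuses the preparets variables above):
% Each configure call assigns fresh random initial weights, so two copies
% of the same architecture differ...
netA = configure(narxnet(inputDelays,feedbackDelays,hiddenLayerSize),inputs,targets);
netB = configure(narxnet(inputDelays,feedbackDelays,hiddenLayerSize),inputs,targets);
isequal(getwb(netA),getwb(netB)) % false: different random initial weights
% ...unless the generator is reset to the same state before each call.
rng('default')
netA = configure(narxnet(inputDelays,feedbackDelays,hiddenLayerSize),inputs,targets);
rng('default')
netB = configure(narxnet(inputDelays,feedbackDelays,hiddenLayerSize),inputs,targets);
isequal(getwb(netA),getwb(netB)) % true: identical initial weights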

Accepted Answer

Shashank Prasanna on 14 Nov 2013
Edited: Shashank Prasanna on 14 Nov 2013
Weights are initialized randomly. If you want to reproduce the results, one way is to reset the random seed each time you run the code.
Include the following line at the beginning of the script:
rng('default')
You will then get the same results on every run.
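A minimal sketch of the effect, reusing the preparets variables from the question (trainOnce is a hypothetical local helper, placed at the end of the script file):
rng('default'); p1 = trainOnce(inputs,targets,inputStates,layerStates);
rng('default'); p2 = trainOnce(inputs,targets,inputStates,layerStates);
isequal(p1,p2) % true: same seed, same initial weights, same result

function perf = trainOnce(X,T,Xi,Ai)
    % Build and train the same NARX architecture as in the question.
    net = narxnet(1:4,1:24,12);
    net.divideFcn = 'divideind';
    net.divideParam.trainInd = 1:5112;
    net.divideParam.valInd = 5113:7320;
    net.divideParam.testInd = 7321:8760;
    net.trainParam.showWindow = false; % suppress the training GUI
    [net,~] = train(net,X,T,Xi,Ai);
    perf = perform(net,T,net(X,Xi,Ai));
end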
  4 Comments
SB on 25 Nov 2013
Thank you for your help. I found a way to set the initial weights of the inputs manually. In case someone struggles with the same problem, this is a possible solution:
% Initialize Weights
for i = 1:net.numLayers % i = 1:2
    net.layers{i}.initFcn = 'initwb';
    for j = 1:net.numLayers % j = 1:2
        if net.layerConnect(i,j) == 1
            net.layerWeights{i,j}.initFcn = 'initzero';
        end
    end
end
for k = 1:net.numInputs % k = 1:2
    net.inputWeights{1,k}.initFcn = 'initzero';
end
net = configure(net,inputs,targets);
net = init(net);
for l = 1:net.numInputs % l = 1:2
    net.iw{1,l} = ones(hiddenLayerSize,length(net.inputWeights{1,l}.delays)*net.inputs{l}.size);
end
Harry Smith on 28 Nov 2017
This may also help if you're trying to seed a net:
rng('default')
net1 = setwb(net1,seedGenes);
[net1,tr] = train(net1,inputs,outputs)
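For context, seedGenes here is a full weight/bias vector; a sketch of how one might obtain it (assuming net1 has already been configured or trained):
seedGenes = getwb(net1); % column vector with one entry per weight and bias
net1 = setwb(net1,seedGenes); % writes the same values back into the net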


More Answers (1)

Greg Heath on 29 Nov 2017
1. Weights are different for every run because training begins at a different state of the random number generator. If you want to repeat a previous run, you need to save the state of the rng before each design (see the sketch after this list).
2. For examples, search the NEWSGROUP and/or ANSWERS using
greg rng narxnet
3. Are feedback delays really 1:24, or is that a typo?
4. Is there a specific significance in your data division ratios of 58/25/17?
5. If your outputs should be nonnegative, use LOGSIG as the output transfer function.
6. You can automatically obtain outputs, errors, and final input and feedback delay states from the same train command. See my examples for details.
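A minimal sketch of points 1 and 6, reusing the preparets variables from the question (rngState is an arbitrary name):
% Point 1: capture the generator state before the design so it can be repeated.
rngState = rng;
% Point 6: train also returns outputs Y, errors E, and final input/layer
% delay states Xf/Af, so no separate simulation call is needed.
[net,tr,Y,E,Xf,Af] = train(net,inputs,targets,inputStates,layerStates);

% To repeat the design later: restore the state, re-initialize, retrain.
rng(rngState)
net = init(net); % redraws the same initial weights from the restored state
[net2,tr2] = train(net,inputs,targets,inputStates,layerStates);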
Hope this helps
Thank you for formally accepting my answer
Greg
