Neural Networks warning?

7 views (last 30 days)
Mohamed Abdelsamie
Mohamed Abdelsamie on 9 Mar 2019
Commented: Walter Roberson on 5 Dec 2020
Hi,
When I train any neural network, I get the warning below. It still trains usable networks, but I'd like to know what the warning means.
% Warning: 'trainRation' is not a legal parameter.
% > In nntest.param>do_test (line 63)
% In nntest.param (line 6)
% In network/subsasgn>setDivideParam (line 1838)
% In network/subsasgn>network_subsasgn (line 460)
% In network/subsasgn (line 14)
% In NN_Training (line 78)
I'm using the code below to train the networks, but I don't know why trainRation is causing the warning.
net = fitnet(current_neuron_count, TRAIN_FCN);
net.divideParam.trainRation = 70/100;
net.divideParam.valRation = 15/100;
net.divideParam.testRation = 15/100;
Thanks

Accepted Answer

Walter Roberson
Walter Roberson on 9 Mar 2019
The parameters are trainRatio, valRatio, testRatio.
No final 'n': trainRatio, not trainRation.
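With the spelling corrected, the snippet from the question would look like this (a minimal sketch; `current_neuron_count` and `TRAIN_FCN` are the variables from the question's own code):

```matlab
% Corrected version of the snippet from the question:
% the division parameter names end in 'Ratio', not 'Ration'.
net = fitnet(current_neuron_count, TRAIN_FCN);
net.divideParam.trainRatio = 70/100;  % 70% of samples for training
net.divideParam.valRatio   = 15/100;  % 15% for validation
net.divideParam.testRatio  = 15/100;  % 15% for testing
```

With the misspelled names, these assignments were silently ignored, so the network was trained with whatever default ratios the data-division function uses.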
  3 Comments
Walter Roberson
Walter Roberson on 9 Mar 2019
The versions with 'Ration' would have had those commands ignored, leaving you with the default ratios.
Mohamed Abdelsamie
Mohamed Abdelsamie on 9 Mar 2019
Thanks a lot Walter!


More Answers (1)

alsharif taha
alsharif taha on 5 Dec 2020
When I train this network I get errors.
Please help me.
clc
clear
close all
p=[1:10 10:10:100];
t= (p.^2);
net=newff(p,t,[3], {'logsig' 'purelin'});
net.divideParam.trainRatio=1;
net.divideParam.testRatio=0;
net.divideParam.valRatio=0;
net.divideParam.lr=0.01;
net.divideParam.min_grad=1e-20;
net.divideParam.goal=1e-30;
net.divideParam.epochs=300;
net=train(net,p,t);
plot([1:100] .^2,'x')
hold on
plot(round(net(1:100)),'o')
plot(p,t, '*g')
legend('real target', 'output of net', 'training samples', 'location', 'north west')
The error messages are:
Warning: 'min_grad' is not a legal parameter.
Warning: 'min_grad' is not a legal parameter.
Warning: 'min_grad' is not a legal parameter.
Also, although I set epochs to 300, training continues to 1000 epochs.
I do not know why. Please help me.
  1 Comment
Walter Roberson
Walter Roberson on 5 Dec 2020
min_grad is a training parameter (see https://www.mathworks.com/help/deeplearning/ref/traingdx.html), not a divideParam setting.
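A sketch of the code above with those settings moved to trainParam (the network setup is copied from the question; only the parameter targets change). This should also fix the epoch count: the `epochs=300` assignment was being ignored for the same reason, so training ran to the default 1000 epochs.

```matlab
p = [1:10 10:10:100];
t = p.^2;
net = newff(p, t, 3, {'logsig' 'purelin'});

% Data-division ratios belong on divideParam ...
net.divideParam.trainRatio = 1;
net.divideParam.valRatio   = 0;
net.divideParam.testRatio  = 0;

% ... but lr, min_grad, goal and epochs are training
% parameters and belong on trainParam instead.
net.trainParam.lr       = 0.01;
net.trainParam.min_grad = 1e-20;
net.trainParam.goal     = 1e-30;
net.trainParam.epochs   = 300;  % now the 300-epoch limit is honored

net = train(net, p, t);
```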


Version

R2018b
