lsqcurvefit stops with the message "Initial point is a local minimum. Optimization completed because the size of the gradient at the initial point is less than the default value of the function tolerance."

65 views (last 30 days)
The result I get is not the optimum, and the parameters returned are always the initial ones. My code is below.
global c
xdata = [0; 1; 2; 3; 4; 5; 6; 7; 8; 9; 10; 15; 20; 25; 30; 35; 40; 45; 50; ...
    55; 60; 70; 80; 90; 120; 500; 1000; 10000; 30000];
ydata = [0; NaN; NaN; NaN; NaN; 1.34288; NaN; NaN; NaN; NaN; 1.43759; NaN; NaN; ...
    1.59058; 1.59786; 1.60515; 1.60515; 1.55415; 1.54687; 1.58329; NaN; 1.64886; ...
    1.72171; 1.72171; 1.72171; NaN; NaN; NaN; NaN];
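% Keep copies of the raw data, drop the NaN rows, and spline-interpolate onto
% the original grid (excluding the last four, very large x values) to obtain a
% gap-free data set (x,y) for fitting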
xdata0 = xdata;
ydata0 = ydata;
i = ismissing(ydata);
xdata(i) = [];
ydata(i) = [];
x = xdata0(1:end-4);
y = spline(xdata,ydata,x);
plot(xdata,ydata,'o')
hold on
plot(x,y,'LineWidth',2)
figure
rng(1997)
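% Random initial guess for the six model parameters, drawn uniformly from [a,b] = [0,10]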
a = 0;
b = 10;
r = a + (b-a).*rand(6,1);
% r = 10*ones(10,1);
c = r;
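% Levenberg-Marquardt with extremely tight tolerances and a 1e-20 central
% finite-difference step (see the answer below regarding this step size)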
options = optimoptions('lsqcurvefit',...
'Algorithm','levenberg-marquardt',...
'FunctionTolerance',1e-20,...
'StepTolerance',1e-20,...
'OptimalityTolerance',1e-20,...
'FiniteDifferenceStepSize',1e-20,...
'FiniteDifferenceType','central',...
'Display','iter');
lb = [];
ub = [];
coeffs = lsqcurvefit(@fitting,r,x,y,lb,ub,options);
c = coeffs;
figure
plot(xdata,ydata,'o')
hold on
xpredict = linspace(0,max(xdata),1000);
ypredict = fitting(coeffs,xpredict);
ypred = fitting(coeffs,x);
p = 0;
bestSSE = 5;   % smallest sum of squared errors found so far
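% Crude search: repeat the fit for b = 0:0.01:1 (the upper bound of the random
% initial guess) and keep the b that gives the smallest sum of squared errors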
for b=0:0.01:1
r = a + (b-a).*rand(6,1);
% r = 10*ones(10,1);
c = r;
options = optimoptions('lsqcurvefit',...
'Algorithm','levenberg-marquardt',...
'FunctionTolerance',1e-20,...
'StepTolerance',1e-20,...
'OptimalityTolerance',1e-20,...
'FiniteDifferenceStepSize',1e-20,...
'FiniteDifferenceType','central',...
'Display','iter');
lb = [];
ub = [];
coeffs = lsqcurvefit(@fitting,r,x,y,lb,ub,options);
c = coeffs;
figure
plot(xdata,ydata,'o')
hold on
xpredict = linspace(0,max(xdata),1000);
ypredict = fitting(coeffs,xpredict);
ypred = fitting(coeffs,x);
if sum((y-ypred).^2) <= bestSSE   % compare the scalar sum of squared errors
bestSSE = sum((y-ypred).^2);
p = b;
end
end
b=p;
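% Refit once with the best b found by the sweep and plot the final prediction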
r = a + (b-a).*rand(6,1);
% r = 10*ones(10,1);
c = r;
options = optimoptions('lsqcurvefit',...
'Algorithm','levenberg-marquardt',...
'FunctionTolerance',1e-20,...
'StepTolerance',1e-20,...
'OptimalityTolerance',1e-20,...
'FiniteDifferenceStepSize',1e-20,...
'FiniteDifferenceType','central',...
'Display','iter');
lb = [];
ub = [];
coeffs = lsqcurvefit(@fitting,r,x,y,lb,ub,options);
c = coeffs;
figure
plot(xdata,ydata,'o')
hold on
xpredict = linspace(0,max(xdata),1000);
ypredict = fitting(coeffs,xpredict);
plot(xpredict,ypredict,'-x','LineWidth',2)
% xlim([0 20])
function ytotal = fitting(c,t)
% Solve the two-state ODE system over the requested times and combine the two
% states into the modeled measurement
tspan = t;
y0 = zeros(2,1);
[~,ys] = ode45(@myodes, tspan, y0);
ytotal = ys(:,1).*(1+c(2)) + ys(:,2) + c(3).*ys(:,2).*(1.5-ys(:,1).*(1+c(2)))./(1+c(3).*ys(:,2));
end
function dydt = myodes(t,y)
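% ODE right-hand side; the parameter vector is read from the global variable c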
global c
dydt = zeros(2,1);
dydt(1) = c(1) .* (2.5 - (1 + c(2)) .* y(1) - y(2) - c(3) .* y(2) .* (1.5 - y(1) .* (1 + c(2))) ./ (1 + c(3) .* y(2))) .* (1.5 - y(1) .* (1 + c(2))) ./ (1 + c(3) .* y(2)) - c(4) .* y(1);
dydt(2) = c(5) .* (2.5 - (1 + c(2)) .* y(1) - y(2) - c(3) .* y(2) .* (1.5 - y(1) .* (1 + c(2))) ./ (1 + c(3) .* y(2))) .* (1.5 - y(2) - c(3) .* y(2) .* (1.5 - y(1) .* (1 + c(2))) ./ (1 + c(3) .* y(2))) - c(6) .* y(2);
end
4 comments
Jinglei on 30 Nov 2022
Thank you for pointing that out. I used so many NaN values to reduce the gaps and make the curve more continuous, but I will still try removing them and see whether that works. @Torsten
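One way to try that, as a minimal sketch that reuses xdata0, ydata0, r, lb, ub, and options from the code above, is to fit the raw non-NaN measurements directly instead of the spline-interpolated curve:

% Sketch: fit only the measured (non-NaN) points; assumes the variables from the question are in scope
keep = ~isnan(ydata0);
xfit = xdata0(keep);
yfit = ydata0(keep);
coeffs = lsqcurvefit(@fitting,r,xfit,yfit,lb,ub,options);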


Answers (1)

Matt J on 30 Nov 2022 (edited 30 Nov 2022)
I suspect your FiniteDifferenceStepSize is too small. Be mindful of the guidelines for Optimizing Differential Equations.
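For reference, a configuration closer to the solver defaults might look like the sketch below; the exact values are illustrative, not taken from the answer:

% Sketch with a larger finite-difference step and realistic tolerances (illustrative values)
options = optimoptions('lsqcurvefit',...
    'Algorithm','levenberg-marquardt',...
    'FiniteDifferenceType','central',...
    'FiniteDifferenceStepSize',1e-6,...   % far larger than 1e-20
    'FunctionTolerance',1e-10,...
    'StepTolerance',1e-10,...
    'Display','iter');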
2 comments
Jinglei on 30 Nov 2022
Thank you for your answer. Do you mean I should increase the FiniteDifferenceStepSize from 1e-20 to something larger, or decrease it to cover a larger range? I tried both but still can't get a good result. With values larger than 1e-20 I don't even get a result; the solver just reports 'local minimum possible'. Do you have any idea why?
Matt J on 30 Nov 2022
'local minimum possible' means the solver might have succeeded. The exitflag would give a clearer picture of why the optimization stopped, though.
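A minimal way to check that, using the call from the question, is to request the extra outputs of lsqcurvefit:

% Ask lsqcurvefit for the exit flag and the solver report
[coeffs,resnorm,residual,exitflag,output] = lsqcurvefit(@fitting,r,x,y,lb,ub,options);
exitflag   % positive values mean a convergence test was satisfied; 0 means the iteration or evaluation limit was hit
output     % solver report: iterations, function count, first-order optimality, stopping message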
