Difference between fittype 'poly1' and 'A*x+B'?

Georg Winkens
Georg Winkens on 13 Oct 2021
Commented: Jason Thomas on 19 Mar 2024
Hello,
I have the following minimum working example to illustrate my problem:
x=[1:1:6]'; %Datapoints in x
y=[320;310;290;250;220;190].*10^9; %Datapoints in y
[xData,yData] = prepareCurveData(x,y);
fit1 = fittype('poly1'); %the suggested polynomial of 1st degree
fit2 = fittype('A*x+B'); %a manually entered polynomial of 1st degree
%now fit both fittypes
fitresult1 = fit(xData,yData,fit1);
fitresult2 = fit(xData,yData,fit2);
While the fit of 'poly1' yields a reasonable result, the fit using the manually entered formula for a linear function produces nonsense, and these nonsense values change every time I run the code.
fitresult1 =
Linear model Poly1:
fitresult1(x) = p1*x + p2
Coefficients (with 95% confidence bounds):
p1 = -2.743e+10 (-3.379e+10, -2.107e+10)
p2 = 3.593e+11 (3.346e+11, 3.841e+11)
fitresult2 =
General model:
fitresult2(x) = A*x+B
Coefficients (with 95% confidence bounds):
A = 0.03445
B = 0.4387
If I don't use the powers of ten, both fits yield the same result:
y2 = [320;310;290;250;220;190];
fitresult2 =
General model:
fitresult2(x) = A*x+B
Coefficients (with 95% confidence bounds):
A = -27.43 (-33.79, -21.07)
B = 359.3 (334.6, 384.1)
What is the difference between the two methods? For smaller powers of 10 (up to .*10^5), both methods yield identical results.
Thanks everyone in advance!
  1 Comment
Jason Thomas
Jason Thomas on 19 Mar 2024
Hey Georg, I realize it's been a few years, but how did you get your error bars for the slope of your fit?
E.g., you have the result: p1 = -2.743e+10 (-3.379e+10, -2.107e+10)
but I just get: p1 = -2.743e+10
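(A minimal sketch, not part of the original comment, assuming fitresult1 is a cfit object from the Curve Fitting Toolbox as in the question: the bounds can be retrieved explicitly with confint.)
ci95 = confint(fitresult1);        %95% confidence intervals, one column per coefficient
ci99 = confint(fitresult1, 0.99);  %optionally request a different confidence level
%displaying the fit object without a trailing semicolon also prints the bounds, as in the question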


Answers (1)

John D'Errico
John D'Errico on 13 Oct 2021
When you use the 'poly1' model, FIT is smart enough to recognize that this is a LINEAR model, and that it is solvable using simple linear regression. No iterative methods are needed, and no starting values are required at all. The nice thing is that even with those powers of 10 applied, a simple LINEAR regression handles the problem without difficulty.
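To make that concrete, here is a minimal sketch (not part of the original answer) of a plain linear least-squares solve of the same model; it needs no iterations and no starting values, and it copes with the 10^9 scale without trouble.
x = (1:6)';
y = [320;310;290;250;220;190].*10^9;
coeffs = [x, ones(size(x))] \ y;  %least-squares solve of y = coeffs(1)*x + coeffs(2)
%coeffs(1) and coeffs(2) match p1 and p2 from the 'poly1' fit above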
However, when you use a model specified as a string, how is MATLAB going to be smart enough to take that string apart, look at the parameters, and decide that the model is just the equivalent of 'poly1'? That is not going to happen, at least not for a while. I certainly would not want to write the code that parses every model apart and has FIT come to some intelligent understanding of it. A big problem is that it would only slow down the code, just to catch the cases where someone does something palpably silly because they don't understand how the modeling is done.
Therefore, FIT is forced to use the same scheme it uses for nonlinear optimization, which requires a pair of starting values. When you fail to provide them, FIT assumes RANDOM starting coefficients and then tries to optimize them. The problem is that with those powers of 10 on the problem, your data is so far out in the weeds that FIT gets lost, because it starts from essentially worthless starting values. So you get a random garbage result, and every time you run it, new random starting values are chosen, so you get a different random garbage model.
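One way around this (a minimal sketch, not part of the original answer) is to give FIT explicit starting values for the custom model via the 'StartPoint' option, so the nonlinear solver does not begin from random coefficients. The guesses below are just rough values read off the data.
x = (1:6)';
y = [320;310;290;250;220;190].*10^9;
[xData,yData] = prepareCurveData(x,y);
ft = fittype('A*x+B');
slopeGuess = (yData(end)-yData(1))/(xData(end)-xData(1));  %crude slope estimate from the endpoints
interceptGuess = yData(1);                                 %crude intercept estimate
%coefficient order follows coeffnames(ft), here {'A';'B'}
fitresult2 = fit(xData,yData,ft,'StartPoint',[slopeGuess, interceptGuess]);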
