# Comparing two curve fits (using AIC?)

29 views (last 30 days)
James Akula on 12 Jan 2023
Commented: Bjorn Gustavsson on 18 Jan 2023
Hello all,
Let's say for a moment I have some a priori ideas about a family of functions that might best describe a particular data set. After I fit the data with each candidate, I can simply look at the outputs (e.g., RMSE, r²) and choose the one with the best values. However, is there a way to gain some degree of confidence about that selection? For example, suppose I have reason to believe that in a set of data like the below, the true relationship is described by a low-order integer exponent in the function y = b×xⁿ + c. I might capture some data and then try to fit it as follows:
```matlab
% Generate some data
b = 1; % True coefficient
n = 3; % True exponent
c = 0; % True y-intercept
rx = randi(100, [100, 1])/10;          % x samples on (0, 10]
rn = 3 + rand([100, 1])/4 - 0.125;     % exponent jittered around n = 3
rc = (rand([100, 1]) * 500) - 250 + c; % uniform noise on [-250, 250], plus c
ry = b * rx.^(rn) + rc;

% Generate and fit the models
m = @(b, c, n, x) b * x.^n + c;
m1 = @(p, x) m(p(1), p(2), 1, x); % n=1
m2 = @(p, x) m(p(1), p(2), 2, x); % n=2
m3 = @(p, x) m(p(1), p(2), 3, x); % n=3
c1 = lsqcurvefit(m1, [1, 0], rx, ry)
c2 = lsqcurvefit(m2, [1, 0], rx, ry)
c3 = lsqcurvefit(m3, [1, 0], rx, ry)

% Plot the results
x = linspace(min(rx), max(rx));
plot(rx, ry, 'ok', x, m1(c1, x), '-g', x, m2(c2, x), '-r', x, m3(c3, x), '-b')

% Determine which model is most likely to be correct. Use AIC?
```
In the above example, m3 should generally fit the data best (unless the random number generation is very unlucky), because that is how I generated the data.
For those who are familiar with Prism: there I might perform this test by starting an "analysis" and choosing Compare, selecting "for each data set, which of two equations (models) fits best", and then choosing the "Akaike's Information Criterion" test. Running that on one set of data, Prism reported the probability that each model is the correct one. This is very helpful, as it tells me how certain I can be about the fit choice (i.e., 86% sure n=3 is better than n=2). What's the best way to do something similar in MATLAB?
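One possible approach (a sketch, not the only way): compute AIC for each fit directly from its residual sum of squares using the least-squares form AIC = N·ln(SS/N) + 2K, then convert the AIC differences into Akaike weights, which play the role of the percentages Prism reports. The variable names below assume `rx`, `ry`, `m1`–`m3`, and `c1`–`c3` from the code above.

```matlab
% Residual sum of squares for each fitted model
SS = [sum((m1(c1, rx) - ry).^2), ...
      sum((m2(c2, rx) - ry).^2), ...
      sum((m3(c3, rx) - ry).^2)];
N = numel(ry); % number of observations
K = 2;         % free parameters per model (b and c; n is fixed)
AIC = N * log(SS / N) + 2 * K;       % least-squares form of AIC
d = AIC - min(AIC);                  % AIC differences from the best model
w = exp(-d / 2) / sum(exp(-d / 2));  % Akaike weights (sum to 1)
% w(i) is the relative likelihood that model i is the best of the candidates
```

With the data generated as above, `w(3)` should usually dominate, mirroring Prism's "percent chance model A is correct" output.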

### Accepted Answer

James Akula on 13 Jan 2023
I believe the following functions perform the AIC test, as shown in https://www.mathworks.com/matlabcentral/answers/1893055-comparing-two-curve-fits-using-aic#comment_2561700. p1 is the probability that the first model is the correct one.
```matlab
% oy     = observed y data
% m1y/m2y = modeled y data from models 1 and 2
% a1/a2   = "answer" (coefficient vector) from each curve fit
df = @(oy, a) numel(oy) - numel(a); % degrees of freedom
SS = @(my, oy) sum((my - oy).^2);   % residual sum of squares
% AIC difference between model 1 and model 2
DifAIC = @(m1y, m2y, oy, a1, a2) numel(oy) .* ...
    log(SS(m1y, oy) ./ SS(m2y, oy)) + 2 .* (df(oy, a2) - df(oy, a1));
% Probability that model 1 is the correct one
p1 = @(m1y, m2y, oy, a1, a2) ...
    1 - exp(DifAIC(m1y, m2y, oy, a1, a2)/2) ./ ...
    (1 + exp(DifAIC(m1y, m2y, oy, a1, a2)/2));
```
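For example, applied to the fits from the question (assuming the `ry`, `rx`, `m2`, `m3`, `c2`, and `c3` produced by the lsqcurvefit calls above), the probability that the n=3 model beats the n=2 model would be:

```matlab
% Probability that the n=3 model is the better of the two
prob3 = p1(m3(c3, rx), m2(c2, rx), ry, c3, c2);
fprintf('P(n=3 is the better model) = %.2f\n', prob3);
```

Note the ordering: the first modeled-y argument is the model whose probability is returned.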

### More Answers (1)

Bora Eryilmaz on 12 Jan 2023
lsqcurvefit also returns the residual norm of the fit. The model with the smallest residual norm would be the "best" one by this criterion:
```matlab
% Generate some data
b = 1; % True coefficient
n = 3; % True exponent
c = 0; % True y-intercept
rx = randi(100, [100, 1])/10;
rn = 3 + rand([100, 1])/4 - 0.125;
rc = (rand([100, 1]) * 500) - 250 + c;
ry = b * rx.^(rn) + rc;

% Generate and fit the models
m = @(b, c, n, x) b * x.^n + c;
m1 = @(p, x) m(p(1), p(2), 1, x); % n=1
m2 = @(p, x) m(p(1), p(2), 2, x); % n=2
m3 = @(p, x) m(p(1), p(2), 3, x); % n=3
[c1, resnorm1] = lsqcurvefit(m1, [1, 0], rx, ry);
[c2, resnorm2] = lsqcurvefit(m2, [1, 0], rx, ry);
[c3, resnorm3] = lsqcurvefit(m3, [1, 0], rx, ry);
% Each call reports: "Local minimum possible. lsqcurvefit stopped because
% the final change in the sum of squares relative to its initial value is
% less than the value of the function tolerance."

% Plot the results
x = linspace(min(rx), max(rx));
plot(rx, ry, 'ok', x, m1(c1, x), '-g', x, m2(c2, x), '-r', x, m3(c3, x), '-b')

% Determine which model is most likely to be correct
[~, idx] = min([resnorm1 resnorm2 resnorm3]) % Index of best model
```
idx = 3
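Since all three candidate models here have the same number of free parameters, ranking by residual norm is equivalent to ranking by AIC. Even so, the residual norms can be turned into relative probabilities rather than a bare index. A sketch, reusing `ry` and `resnorm1`–`resnorm3` from the code above:

```matlab
% Convert residual norms to AIC values and Akaike weights
N = numel(ry); % number of observations
K = 2;         % free parameters per model (b and c)
AIC = N * log([resnorm1 resnorm2 resnorm3] / N) + 2 * K;
w = exp(-(AIC - min(AIC)) / 2);
w = w / sum(w) % relative probability that each model is the best one
```

This gives the same ranking as `min` on the residual norms, but with a confidence attached, which is what the question was after.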
James Akula on 17 Jan 2023
I hear ya. But I do feel like if you have a priori reasons to think something is either one thing or another, and you are hoping the data will guide you to the truth, then this approach perhaps makes sense.
Bjorn Gustavsson on 18 Jan 2023
@James Akula, sure, and my suggestion might work well enough for this (reasonably certain) simplified toy example, but it can run into severe problems in more complicated real-world cases. I mainly thought it was worth mentioning as a reminder for the case where someone has this specific choice to make. (And I don't need anyone to argue this quandary with - I come down on all three of the two sides in this discussion and have frequent and angry arguments with myself from last week...)

