Which statistic is minimized in the Curve Fitting app?
Dear colleagues,
The statistics which evaluate the goodness of fit in the Curve Fitting app are shown in the picture below.

A description of each of these statistics is given at this link.
My question is: which parameter/statistic is minimized to make a fit? In another discussion I read something about the 2-norm. Is one of the statistics shown above a 2-norm?
0 comments
Answers (2)
Torsten
on 26 Jan 2025
Edited: Torsten
on 26 Jan 2025
My question is what parameter/statistic is minimized to make a fit?
SSE (weighted, if you specify weights for the measurement data), i.e. the square of the (weighted) 2-norm of the error.
"fminimax" tries to minimize the Inf-norm of the error.
1 comment
John D'Errico
on 26 Jan 2025
Edited: John D'Errico
on 26 Jan 2025
The sum of squares of the residuals is minimized. SSE effectively stands for Sum of Squared Errors.
What is the 2-norm? Just the square root of the sum of squares. Is minimizing the sum of squares different from minimizing the square root of the sum of squares? No! They are the same, in terms of the result. If you find the minimizer of the sum of squares, it is the SAME minimizer as for the square root of the sum of squares. But NONE of the numbers shown in that picture are the 2-norm. Again though, that is irrelevant. You CAN view the SSE as the square of the 2-norm.
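The claim that the sum of squares and its square root share the same minimizer is easy to check numerically; a small sketch with made-up data and a one-parameter model y = a*x (the values are illustrative assumptions, not from the thread):

```python
import numpy as np

# Made-up data for the one-parameter model y = a*x.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# Evaluate the SSE on a fine grid of candidate coefficients.
a_grid = np.linspace(0.0, 4.0, 4001)
sse_vals = np.array([np.sum((y - a * x)**2) for a in a_grid])

# The square root is monotone, so both objectives pick the same a.
i_sse = np.argmin(sse_vals)
i_norm = np.argmin(np.sqrt(sse_vals))
assert i_sse == i_norm
```

For this model the closed-form least-squares solution is a = sum(x*y)/sum(x**2), and the grid minimum lands on it to within the grid spacing.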
Note that if you supply weights, then fit will minimize a weighted sum of squares, given the weights you supplied. Still no real difference, except that it is a weighted sum.
Finally, fit allows a robust option, even though you did not ask about robust fitting.
A robust fit is usually performed as an iteratively reweighted scheme. The fit is first done using no weights. Then the points with the largest residuals are downweighted by some scheme. (There are several choices for robust fitting; you will need to do some reading to decide exactly which method is used. I think the default, if you choose robust, is the bisquare method.) The fit is then done again, using the new set of weights, and this operation is repeated until convergence.
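A sketch of that iteratively reweighted scheme with bisquare (Tukey) weights; the data, the MAD-based scale estimate, and the tuning constant 4.685 are common illustrative choices, not taken from the Curve Fitting Toolbox source:

```python
import numpy as np

# Made-up data around y = 3x + 2, with one gross outlier injected.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
y = 3.0 * x + 2.0 + rng.normal(0, 0.3, x.size)
y[5] += 20.0  # the outlier

A = np.column_stack([x, np.ones_like(x)])  # design matrix for y = m*x + b
w = np.ones_like(y)                        # first pass: no weights
for _ in range(20):
    sw = np.sqrt(w)
    # Weighted least squares: scale each equation by sqrt(weight).
    coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    r = y - A @ coef
    s = np.median(np.abs(r)) / 0.6745      # robust scale estimate from the MAD
    u = r / (4.685 * s)                    # 4.685: a standard bisquare constant
    # Bisquare: downweight large residuals, zero weight beyond the cutoff.
    w = np.where(np.abs(u) < 1, (1 - u**2)**2, 0.0)

# The outlier ends up with zero weight and the clean line is recovered.
```

The outlier dominates an ordinary least-squares fit, but after a few reweighting passes its weight drops to zero and the slope and intercept settle near 3 and 2.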
12 comments
Torsten
on 8 Feb 2025
Edited: Torsten
on 8 Feb 2025
I think this is pretty clear now.
From the web page:
A least-squares fitting method calculates model coefficients that minimize the sum of squared errors (SSE), which is also called the residual sum of squares. Given a set of n data points, the residual for the ith data point ri is calculated with the formula
r_i = y_i − ŷ_i
where y_i is the ith observed response value and ŷ_i is the ith fitted response value. The SSE is given by
SSE = Σ_{i=1}^{n} r_i² = Σ_{i=1}^{n} (y_i − ŷ_i)²
If the response data error does not have constant variance across the values of the predictor data, the fit can be influenced by poor quality data. The weighted least-squares fitting method uses scaling factors called weights to influence the effect of a response value on the calculation of model coefficients. Use the weighted least-squares fitting method if the weights are known, or if the weights follow a particular form.
The weighted least-squares fitting method introduces weights in the formula for the SSE, which becomes
SSE = Σ_{i=1}^{n} w_i (y_i − ŷ_i)²
where w_i are the weights.
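Both formulas can be evaluated directly; a tiny sketch with made-up observed values, fitted values, and weights:

```python
import numpy as np

# Made-up values, purely for illustration.
y = np.array([1.0, 2.0, 3.0, 4.0])      # observed responses y_i
yhat = np.array([1.1, 1.9, 3.2, 3.8])   # fitted responses yhat_i
w = np.array([1.0, 0.5, 2.0, 1.0])      # weights w_i

r = y - yhat                 # residuals r_i = y_i - yhat_i
sse = np.sum(r**2)           # SSE = sum of r_i^2
wsse = np.sum(w * r**2)      # weighted SSE = sum of w_i * (y_i - yhat_i)^2

print(sse, wsse)  # 0.10 and 0.135 for these numbers
```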