Least squares fit to multiple differently-sized data sets simultaneously
Richard Cobley
on 12 Aug 2016
Commented: Riccardo Marinelli on 5 Nov 2022
I can use lsqcurvefit to simultaneously fit a system of non-linear equations using matrix arguments. However, the requirement that every matrix element be defined seems to mean that all data sets must be the same size, and I can't find a way around that.
To make it clearer, if I make up some simple data sets which are the same size:
x=[1 3 5 7; 5 7 9 11; 9 11 13 15];
y=[10 11 12 13; 10 11 12 13; 10 11 12 13];
and want to fit each with a straight line, where they all share the same gradient (so the solver must simultaneously fit that across all three data sets, but fit the individual offset to each set separately):
%variables v(1:4) are m, c1, c2, c3
fn = @(v,xdata)[v(1).*xdata(1,:) + v(2); v(1).*xdata(2,:) + v(3); v(1).*xdata(3,:) + v(4)];
I can set up a guess and solve using:
x0 = [1; 10; 9; 8];
fitvars = lsqcurvefit(fn,x0,x,y);
The problem comes when the data sets don't have the same number of points - if the second row of xdata and ydata only had three points, how do I pass the data to lsqcurvefit?
I've tried padding the matrices with NaN but then the objective function returns undefined numbers and the solver complains. I tried converting the data and the function to cell arrays to allow different lengths, but lsqcurvefit won't take it.
I'm aware this particular example can be solved trivially, but the real functions and data sets are non-linear (which is why I'm using lsqcurvefit), and the system has a minimum of five data sets to solve, with one shared variable between the functions as well as several variables and constants unique to each.
Thanks for any help.
0 Comments
Accepted Answer
John D'Errico
on 2 Sep 2016
Edited: John D'Errico on 2 Sep 2016
Yes. There is a way. Simply store the data in a cell array. Make the vectors COLUMN vectors. I was lazy here when I did that, using transposes.
x={[1 3 5 7]' [5 7 9 11]' [9 11 13 15]'};
y={[10 11 12 13]' [10 11 12 13]' [10 11 12 13]'};
yy = vertcat(y{:});
%variables v(1:4) are m, c1, c2, c3
fn = @(v,xdata) [v(1).*xdata{1} + v(2); v(1).*xdata{2} + v(3); v(1).*xdata{3} + v(4)];
x0 = [1; 10; 9; 8];
fitvars = lsqcurvefit(fn,x0,x,yy);
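For the unequal-length case the question actually asked about, the same pattern applies. Here is a quick sketch along the same lines, with the second set shortened to three points:
x = {[1 3 5 7]' [5 7 9]' [9 11 13 15]'};
y = {[10 11 12 13]' [10 11 12]' [10 11 12 13]'};
yy = vertcat(y{:}); % stack all observations into one long column
fn = @(v,xdata) [v(1).*xdata{1} + v(2); v(1).*xdata{2} + v(3); v(1).*xdata{3} + v(4)];
x0 = [1; 10; 9; 8];
fitvars = lsqcurvefit(fn,x0,x,yy);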
5 Comments
John D'Errico
on 4 Nov 2022
There are so many unspoken things in your question that I cannot easily answer you. Is this a LINEAR regression? Or a nonlinear regression, as the original question asked?
Do you really mean to compute the weights as you did?
weight1 = 1./(err1.^1); weight2 = 1./(err2.^2);
Note that weight1 only uses err1.^1, yet you are squaring the weights for weight2. Was that intentional, or a typo?
The original question was a complicated one, where some of the parameters are shared between models, but you did not seem to indicate that.
In fact, it looks like all you are asking to do is a weighted linear regression, perhaps of the form y=a*x+b, and to do it several times. As such, you could just use a loop, calling the function lscov each time, for each subproblem. (lscov is perhaps the simplest tool for a weighted regression, and I know you will have it because it is not part of any toolbox.) Could you do that for all of the unknowns at once, in one big call? Well, yes, but that is surely more effort than you need, and unless you understand how to use sparse matrices, it would be inefficient.
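To make that concrete, here is a minimal sketch of the looped, weighted lscov idea; the cell arrays xsets, ysets and errsets below are made-up names and numbers, not your data:
xsets = {[1 3 5 7]', [2 4 6 8 10]'}; % independent variables, any lengths
ysets = {[10 11 12 13]', [9 11 13 15 17]'}; % measurements
errsets = {0.1*ones(4,1), 0.2*ones(5,1)}; % 1-sigma uncertainties
coef = zeros(numel(xsets),2); % one [slope intercept] row per data set
for k = 1:numel(xsets)
    A = [xsets{k}, ones(size(xsets{k}))]; % design matrix for y = m*x + c
    w = 1./errsets{k}.^2; % weights = inverse variances
    coef(k,:) = lscov(A, ysets{k}, w).'; % weighted linear least squares
end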
Riccardo Marinelli
on 5 Nov 2022
My situation was this: I needed to perform a simultaneous fit of seven curves with shared parameters, since I didn't have enough data to do many separate fits. I also wanted to weight the fit, as is usually done, by the inverse square of the uncertainties.
I found the awesome custom function nlinmultifit(), which also makes it easy to obtain the confidence intervals of the fits, so yesterday I stuck with that function.
If you are interested, here is what I obtained with some made-up data as a test:
More Answers (1)
Qu Cao
on 24 Aug 2016
The data sets should have the same number of points; 'xdata' and 'ydata' should be well-defined vectors or matrices.