Fit a conic section without mirror image hyperbola
I would like to fit conic sections to my data using something like the general form a.*x.^2+b.*x.*y+c.*y.^2+d.*x+e.*y+f. I have found a number of tools for this on MATLAB Central and am currently using https://www.mathworks.com/matlabcentral/fileexchange/22683, which very often works brilliantly. One of the things I am curious to know is when my data are best approximated by an ellipse and when they are best approximated by a hyperbola, so I don't want to force one or the other. When the data are ellipsoidal (or, theoretically, parabolic or circular, though that never happens in real life), I get exactly what I want. When the data are hyperbolic, I very often get what I want as well, but sometimes the best-fitting hyperbola involves both sides of the hyperbola pair. I want to constrain the fit so that only one "mirror image" of a hyperbola can be used.
Here is an example: For the data below, I hope to get something like the blue hyperbola (just drawn by hand), but instead, I get the orange hyperbola pair as the best fit. I want to force the fit to use only one half of the pair. But I don't want to exclude the possibility of getting an ellipse as the best fit.
XY = [0, -2.975; ...
2.5, -2.975; ...
5, -2.75; ...
7.5, -2.25; ...
10, -2.41667; ...
15, -2.25; ...
20, -1.75; ...
25, -1.9; ...
30, -1.675; ...
35, -1.575; ...
40, -1.625; ...
-2.5, -3; ...
-5, -3.25; ...
-7.5, -3.375; ...
-10, -3.08333; ...
-15, -2.85; ...
-20, -2.225; ...
-25, -2.0625; ...
-30, -1.375; ...
-35, -1.5625];
Any tips?
6 comments
Matt J on 14 Jan 2020
Edited: Matt J on 14 Jan 2020
"It is also not immediately obvious how to determine which error is better, since we want an orthogonal regression value, not a minimization in either Y or X."
Then your question is really, "How do I do orthogonal regression for a one-sided hyperbola?" If you have an orthogonal regression routine for ellipses, then once you have one for hyperbolas, my answer applies.
"the answer you provided only fits hyperbolas that 'open' up or down, rather than to the left or right, which is quite possible."
No, it will fit an arbitrarily oriented hyperbola. In its current form it is not an orthogonal regression, though. Do you know for certain that the Taubin algorithm you are currently using from the File Exchange is orthogonal regression? I don't think it could be, because it claims to be non-iterative, and I don't think it is possible to do orthogonal regression of conics non-iteratively. Are you sure you need a fit as robust as orthogonal regression? It's a lot of extra effort...
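Just to illustrate what orthogonal regression entails here, below is a minimal sketch (not from any submission; the curve and data points are made-up numbers) of how orthogonal residuals could be computed for a one-branch model y = f(x). Every data point needs its own inner minimization to find the nearest point on the curve, which is why such a fit cannot be non-iterative.
fcurve = @(x) 2*sqrt(((x-1)/5).^2+1) - 3;        % hypothetical one-branch hyperbola
pts = [0 -1; 4 -0.5; 8 1];                       % hypothetical data points
d = zeros(size(pts,1),1);
for i = 1:size(pts,1)
    xi = pts(i,1);  yi = pts(i,2);
    dist2 = @(x) (x-xi).^2 + (fcurve(x)-yi).^2;  % squared distance from point i to the curve
    xstar = fminsearch(dist2, xi);               % nearest curve point, found iteratively
    d(i) = sqrt(dist2(xstar));                   % orthogonal distance for point i
end
orthogonalError = norm(d);                       % this is what an orthogonal fit would minimize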
Answers (2)
Matt J on 14 Jan 2020
Edited: Matt J on 14 Jan 2020
This approach uses John's fminspleas submission from the File Exchange. Although the fit is, technically, a hyperbola, it diverges toward a piecewise-linear fit here when you don't impose some lower bound on the inter-focal distance. Apparently there is no curvature suggested by your data, so the curve-fitting routine's iterations try to make the hyperbola as sharply pointed as possible.
% Fit with no lower bound on the inter-focal distance ...
lbound = 0;
tmin = fminsearch(@(t) thetaCost(t,XY,lbound), 0);   % best rotation angle (degrees)
[fitError, xy0] = thetaCost(tmin, XY, lbound);

% ... and again with a lower bound of 10
lbound = 10;
tmin = fminsearch(@(t) thetaCost(t,XY,lbound), 0);
[fitError, xy5] = thetaCost(tmin, XY, lbound);

plot(XY(:,1),XY(:,2),'o', xy0(:,1),xy0(:,2),'r.-', xy5(:,1),xy5(:,2),'k.-');
legend('Raw Data', 'Inter-Focal Distance >=0', 'Inter-Focal Distance >=10')

function [r,xy,p] = thetaCost(theta,XY,lbound)
    % Rotate the data by theta (degrees) and fit a single hyperbola branch
    % y = A*sqrt(((x-x0)/a).^2+1) + D in the rotated frame.
    R = [cosd(theta), -sind(theta); sind(theta), cosd(theta)];
    xy = XY*R;
    x = xy(:,1);  y = xy(:,2);

    [~,idx] = min(x);
    p0 = [x(idx), y(idx)];                 % initial guess for the nonlinear parameters [x0,a]

    % fminspleas: A and the offset D enter linearly, x0 and a nonlinearly;
    % a is bounded below by lbound/2.
    flist = {@(p,xd) sqrt(((xd-p(1))/p(2)).^2+1), 1};
    [p,AD] = fminspleas(flist, p0, x, y, [-inf, lbound/2]);
    A = AD(1);  D = AD(2);
    x0 = p(1);  a = p(2);

    yfit = @(x) A*sqrt(((x-x0)/a).^2+1) + D;
    r = norm(yfit(x) - y, 1);              % L1 fit error in the rotated frame

    if nargout > 1
        % Sample the fitted branch and rotate it back to the original frame.
        x = linspace(min(x), max(x), 1000).';
        xy = [x, yfit(x)]*R.';
        p = [theta, A, x0, a, D];
    end
end
2 comments
John D'Errico on 14 Jan 2020
Edited: John D'Errico on 14 Jan 2020
This goes back to basic algebra. When does a conic section describe an ellipse versus a hyperbola?
Compute the discriminant, that is, b^2 - 4*a*c. If it is negative, the conic is an ellipse; if it is positive, a hyperbola.
The point being, while the unknowns in the conic appear to be essentially linear, the choice of which specific form you get is a nonlinear thing. There are some special cases where you can get a circle, or a parabola.
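As a quick illustration of that test (the coefficient values below are made up, just to show the check):
% Coefficients [a b c d e f] of a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
coef = [1 0 -2 0 0 -1];                % made-up example: x^2 - 2*y^2 - 1 = 0
disc = coef(2)^2 - 4*coef(1)*coef(3);  % discriminant b^2 - 4*a*c
if disc < 0
    kind = 'ellipse';                  % (a circle when b = 0 and a = c)
elseif disc > 0
    kind = 'hyperbola';                % this example: disc = 8 > 0
else
    kind = 'parabola';
end
disp(kind)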
Next, what if you just want one of the branches? That enters in a nonlinear way too. You can think of it as a choice of whether the positive or negative branch of a square root is taken.
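For instance, here is a minimal sketch with made-up coefficients, assuming the hyperbola has been rotated (if necessary) so that it opens up or down; each sign of the square root then traces one of the two mirror-image branches.
a = -0.05;  b = 0;  c = 1;  d = 0;  e = 0;  f = -1;   % made-up conic: y^2 - 0.05*x^2 - 1 = 0
xgrid = linspace(-40, 40, 200);
% Solve c*y^2 + (b*x+e)*y + (a*x^2+d*x+f) = 0 for y with the quadratic formula;
% keeping a single sign of the square root keeps a single branch.
ybranch = @(x,s) (-(b*x+e) + s*sqrt((b*x+e).^2 - 4*c*(a*x.^2+d*x+f)))./(2*c);
plot(xgrid, ybranch(xgrid,+1), '-', xgrid, ybranch(xgrid,-1), '--')
legend('one branch', 'its mirror image')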
Effectively, all of that makes it nontrivial to have the choices you want made automatically. The basic tools you will find for this are not intelligent enough to let you make those choices.
There are always things we want. Achieving our goals often takes work.