How to use implicit model functions in Curve Fitting Toolbox

18 views (last 30 days)
kdb acml
kdb acml on 11 Nov 2020
Edited: kdb acml on 16 Nov 2020
I am trying to fit some data to a custom implicit equation, a*x^2 + b*y^2 + c*y = 0 (a 2nd-degree polynomial in x and y), and obtain the coefficients (a, b, and c) for that data.
But cftool does not have an implicit option in its custom equation tab.
Any help on the topic is appreciated. I understand that people are coding it in various ways, and I would really appreciate it if a GUI-based answer is also possible.
  2 Comments
Matt J
Matt J on 12 Nov 2020
You need some sort of constraint on a, b, and c, because otherwise a trivial solution is simply a = b = c = 0.
kdb acml
kdb acml on 12 Nov 2020
Yes. The constraints are that c is 1.0, and that a and b are non-zero.


Accepted Answer

Ameer Hamza
Ameer Hamza on 11 Nov 2020
Edited: Ameer Hamza on 11 Nov 2020
You can consider the equation as z = a*x^2 + b*y^2 + c*y, where z = 0. Then define x, y, and z in MATLAB like this:
x; y; % your data points
z = zeros(size(x));
and then use cftool() with the x, y, and z variables. In the custom equation box, write
a*x^2+b*y^2+c*y
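For completeness, here is a minimal programmatic sketch of the same idea using fit() and fittype() instead of the cftool GUI. It assumes x and y are already column vectors of data, and it fixes c = 1 (per the constraint mentioned in the comments) so that the trivial all-zero solution is excluded; the start point is an arbitrary choice.
% Sketch: treat the implicit model a*x^2 + b*y^2 + y = 0 (c fixed to 1)
% as a surface z = f(x,y) that must be zero at every data point.
ft = fittype('a*x^2 + b*y^2 + y', ...
    'independent', {'x','y'}, 'dependent', 'z');
z = zeros(size(x));                          % the implicit equation equals zero
fittedmodel = fit([x, y], z, ft, 'StartPoint', [1, 1]);
coeffvalues(fittedmodel)                     % estimated [a b]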
  7 Comments
Ameer Hamza
Ameer Hamza on 12 Nov 2020
If you load the data like this
data = load('new.txt');
x = data(:,1);
y = data(:,2);
z = zeros(size(x));
and use cftool to fit the model, then export the model to the base workspace and run the following:
f = @(x,y) fittedmodel.a/fittedmodel.b*x.^2 + y.^2 + fittedmodel.c/fittedmodel.b*y;
fimplicit(f, [min(x), max(x), min(y), 0])
hold on
plot(x, y, '+')
You will get a plot of the fitted implicit curve overlaid on the data points.
The actual fit will depend on the initial condition.
kdb acml
kdb acml on 13 Nov 2020
Edited: kdb acml on 13 Nov 2020
Thank You.
It seems I was just confused by the plot in cftool. I am getting the coefficients now. Again, thanks a lot.
All three methods suggested in the three comments are working, with some variance in the results.


More Answers (2)

Matt J
Matt J on 12 Nov 2020
Edited: Matt J on 13 Nov 2020
An analytical solution is possible here:
[x,y] = deal(x(:),y(:));       % force column vectors
A = [x.^2, y.^2, y];           % row i is [x_i^2, y_i^2, y_i]
s = std(A,1);                  % per-column scaling
[~,~,V] = svd(A./s,0);         % SVD of the column-scaled matrix
abc = V(:,end)./s(:);          % singular vector of the smallest singular value, unscaled
abc = abc/abc(end);            % normalize so that c = 1
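A quick way to sanity-check this is to run it on synthetic data; the ground-truth coefficients and noise level below are made-up values, chosen only for illustration:
a0 = 2; b0 = -1;                             % assumed ground truth, with c = 1
x = linspace(-1, 1, 200).';
y = (-1 + sqrt(1 - 4*a0*b0*x.^2))/(2*b0);    % one branch of a0*x^2 + b0*y^2 + y = 0
x = x + 0.005*randn(size(x));                % noise on both coordinates
y = y + 0.005*randn(size(y));
A = [x.^2, y.^2, y];
s = std(A,1);
[~,~,V] = svd(A./s,0);
abc = V(:,end)./s(:);
abc = abc/abc(end)                           % should be close to [2; -1; 1]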
  6 Comments
Bruno Luong
Bruno Luong on 13 Nov 2020
Edited: Bruno Luong on 13 Nov 2020
Better indeed. I would argue this is even "more" correct (increase the noise s to 0.03 and you'll see the effect; full code tlsqr.m attached).
[x,y] = deal(x(:),y(:));
A = [x.^2, y.^2, y];
C = cov(A);
S = sqrtm(C);
S = diag(diag(S));    % keep only the diagonal of the matrix square root
% your method is equivalent to using
% S = diag(std(A))    % or
% S = diag(sqrt(diag(C)));
[~,~,V] = svd(A/S,0);
abc = S\V(:,end);
abc = abc/abc(end);
The method still bothers me a great deal, since the scaling affects both the signal spread and the noise spread. The noise in A is no longer Gaussian, since the transformation is not linear, etc.
kdb acml
kdb acml on 16 Nov 2020
Edited: kdb acml on 16 Nov 2020
Thanks!
The variance of the coefficients across all three methods is small for my data at the moment. So now I just need to remove some statistical errors before trying to find the actual differences between these methods.



Bruno Luong
Bruno Luong on 12 Nov 2020
Edited: Bruno Luong on 12 Nov 2020
Why not just solve this with linear algebra? This straightforward method seems to do the job:
xy = load('new.txt');
x = xy(:,1);
y = xy(:,2);
P = -([x.^2,y.^2] \ y);   % least-squares solution of a*x^2 + b*y^2 = -y, with c fixed to 1
a = P(1);
b = P(2);
c = 1;
% Plot
xi = linspace(min(x),max(x));
yi = linspace(min(y),max(y)).';
z = reshape(a*xi.^2+b*yi.^2+c*yi,[length(yi) length(xi)]);
close all
figure
contour(xi,yi,z,[0 0],'r');
hold on
plot(x,y,'.')
legend('fit','data')
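As a rough quality check (a small addition, reusing the variables from the snippet above), one can look at the implicit residuals, which should be near zero for a good fit:
res = a*x.^2 + b*y.^2 + c*y;                 % implicit residuals, ideally ~0
fprintf('RMS implicit residual: %g\n', sqrt(mean(res.^2)));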
  5 Comments
Bruno Luong
Bruno Luong on 13 Nov 2020
Edited: Bruno Luong on 13 Nov 2020
Vast topic. All of the regression literature deals with noise issues in some way. For a start, you can look at the noise covariance matrix (normalized Gaussian noise) and at regularization techniques: the art of finding the right balance between bias (systematic error) and statistical error, which depends on the inverse of the Hessian at the minimum, i.e., on how you formulate the objective function to achieve the regression goal.
In practice the noise is sometimes not Gaussian, and it's a big mess even at the theoretical level.
If you really want to fight against bias, you first need to understand and characterize your measurement noise. If you start by saying "I do not know what noise characteristics means", then you have a long, long way to go.
Total least squares tries to deal with problems where the error affects both the measurement query points and the values; this is what Matt drew inspiration from by using the SVD, but his nice idea - sorry to say - is flawed for various reasons.
EDIT: Matt's tlsq method is much better after data "normalization".
kdb acml
kdb acml on 16 Nov 2020
Thanks! I'll look these up in my free time!


Version: R2020b
