
Curve fitting for non-linear data

Julian on 11 Oct 2018
Commented: Alex Sha on 29 Jan 2022
I am trying to fit some data using lsqcurvefit in MATLAB but I am fairly new to this area.
xdata1 = [0 60 660 1250];
ydata1 = [0 18 23 31];
In the image below, the red line is the fit I want to achieve. Unfortunately, polyfit does not give suitable results.
How can I achieve this fit? Thank you in advance!

Accepted Answer

Matt J on 11 Oct 2018
I believe piecewise-linear fitting is within the scope of Bruno's free-knot spline fitting package.
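For illustration only (this is not Bruno's package; the breakpoint parameterization and starting values below are my own assumptions), a continuous two-segment piecewise-linear fit can also be set up directly with lsqcurvefit, which the question already mentions:
% Sketch of a two-segment piecewise-linear fit with a free breakpoint.
% p = [intercept, slope1, breakpoint, slope2] (assumed parameterization)
xdata1 = [0 60 660 1250];
ydata1 = [0 18 23 31];
pw = @(p,x) p(1) + p(2)*min(x, p(3)) + p(4)*max(x - p(3), 0);
p0 = [0 0.3 100 0.01];                        % rough initial guess
p  = lsqcurvefit(pw, p0, xdata1, ydata1);     % requires Optimization Toolbox
xf = linspace(0, 1250, 500);
plot(xdata1, ydata1, 'o', xf, pw(p, xf), 'r-')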
  1 comment
Julian on 12 Oct 2018
Thank you Matt, I will look into this!


More Answers (2)

Chaoyu Zhang on 11 Oct 2018
Edited: Chaoyu Zhang on 15 Oct 2018
You can use the method described below.
The target equation (3rd order, or higher if needed) is
y = a*x.^3 + b*x.^2 + c*x + d;
A * p = y;
p is the vector of equation parameters,
p = [a;b;c;d]
A is the matrix built from x.^3, x.^2, x, and 1,
A = [x(1).^3 x(1).^2 x(1) 1; ... ; x(n).^3 x(n).^2 x(n) 1]
y is the vector of measured y values,
y = [y(1); ... ;y(n)];
p = (A.'*A)^(-1)*A.'*y;
Now you get the parameters you need.
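As a minimal sketch of this approach applied to the data from the question (the cubic model is just an assumption, and with four points and four coefficients it interpolates exactly):
x = [0 60 660 1250].';
y = [0 18 23 31].';
A = [x.^3 x.^2 x ones(size(x))];   % design matrix for the cubic
p = (A.'*A) \ (A.'*y);             % normal equations; p = A\y is preferable
xf = linspace(0, 1250, 500).';
plot(x, y, 'o', xf, polyval(p, xf), 'r-')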
  5 comments
Matt J on 15 Oct 2018
Edited: Matt J on 15 Oct 2018
Well, the poorer accuracy comes from the inversion of (A.'*A), since cond(A.'*A) is the square of cond(A).
>> A=rand(1000,100);
>> cond(A.'*A)
ans =
632.4462
>> cond(A)
ans =
25.1485
When solving with mldivide(), the QR decomposition is used, which avoids this inversion. With A=Q*R,
(A.'*A)^(-1)*A.'*y
=(R.'*R)^(-1)*R.'*Q.'*y
=R^(-1)*Q.'*y
So, the inversion involves only R^(-1) and cond( R )=cond(A).
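As a quick numerical check of this identity (my own sketch, not from the original comment), the economy-size QR solution matches backslash without ever forming an explicit inverse:
A = rand(1000,100);
y = rand(1000,1);
[Q,R] = qr(A,0);          % economy-size QR factorization
x_qr = R \ (Q.'*y);       % equivalent to R^(-1)*Q.'*y, no explicit inverse
x_bs = A \ y;
norm(x_qr - x_bs)         % should be on the order of machine precision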
Matt J on 15 Oct 2018
Edited: Matt J on 15 Oct 2018
Here is a test showing the increased error sensitivity of inv(A.'*A)*(A.'*y).
N=1000;                               % number of noisy trials
M=15;                                 % problem size
A=vander(linspace(1,3.3,M)) + eye(M); % ill-conditioned test matrix
xt=rand(M,1);                         % ground-truth solution
yt=A*xt;
y=yt+randn(M,N)*1e-6;                 % add small noise to each trial
x1=inv(A.'*A)*(A.'*y);                % normal-equations solution
x2=A\y;                               % backslash (QR-based) solution
Error1=mean( sqrt(sum((x1-xt).^2)) )
Error2=mean( sqrt(sum((x2-xt).^2)) )
should give something like
Error1 =
5.0094
Error2 =
2.8313e-04



Image Analyst on 11 Oct 2018
You cannot get that fit unless you supply a model curve for that shape; otherwise, fitting functions have no way to know whether the data follow a piecewise-linear model, a sharply kinked log function, or something else. Having more data points would also help. Once you have a model, you can use fitnlm.
I'm attaching several examples of piecewise-linear and non-linear fits.
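Since the attached examples are not reproduced here, the following is only a sketch of how fitnlm could be used with an assumed saturating model (the model form and starting values are my own guesses; fitnlm requires the Statistics and Machine Learning Toolbox):
xdata1 = [0 60 660 1250].';
ydata1 = [0 18 23 31].';
model = @(b,x) b(1)*x ./ (b(2) + x);   % assumed saturation-type model
beta0 = [35 200];                      % rough starting guesses
mdl = fitnlm(xdata1, ydata1, model, beta0);
xf = linspace(0, 1250, 500).';
plot(xdata1, ydata1, 'o', xf, predict(mdl, xf), 'r-')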
  4 comments
Julian on 11 Oct 2018
Yes, that would greatly improve the fit. Can you describe how to combine two linear fits into one function in MATLAB?
Alex Sha on 29 Jan 2022
For a third-order polynomial fit, y = p1 + p2*x + p3*x^2 + p4*x^3, the results are:
Root of Mean Square Error (RMSE): 5.67944047963309E-15
Sum of Squared Residual: 1.2902417664678E-28
Correlation Coef. (R): 1
R-Square: 1
Parameter    Best Estimate
---------    ---------------------
p1           3.92787554473843E-15
p2           0.340654276995852
p3           -0.000698994200659206
p4           3.57048623250019E-7
However, if the function is taken as y = p1*x/(p2+x)^2 - p3*x:
Root of Mean Square Error (RMSE): 0
Sum of Squared Residual: 0
Correlation Coef. (R): 1
R-Square: 1
Parameter    Best Estimate
---------    ---------------------
p1           9035.3274708592
p2           119.630504666011
p3           -0.0199834390844168
The second function is clearly more reasonable than the third-order polynomial fit.
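For reference, a sketch of how the second model could be refit in MATLAB with lsqcurvefit (the solver and starting values are my own choices; the parameters above presumably came from a different tool):
xdata1 = [0 60 660 1250];
ydata1 = [0 18 23 31];
model = @(p,x) p(1)*x ./ (p(2) + x).^2 - p(3)*x;
p0 = [9000 120 -0.02];                       % start near the reported values
p  = lsqcurvefit(model, p0, xdata1, ydata1); % requires Optimization Toolbox
xf = linspace(0, 1250, 500);
plot(xdata1, ydata1, 'o', xf, model(p, xf), 'r-')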
