How to constrain the values of fitted parameters with lsqcurvefit?

18 views (last 30 days)
I am fitting a multiexponential function to a data set, but the solutions that lsqcurvefit is finding are far from realistic values. If I wanted to constrain the values of the parameters to be greater than 0 but not greater than some other value, let's say 0.001, how would I do it?
I will include my code below:
clear; clc; clf; close all;
D_0 = [0.5 0.00001 0.5 0.00001]; %initial guess
xdata = [9.85 32.05 66.83 114.44 174.55 247.57 333.00 431.44 542.19 666.04 802.12 951.39 1113.38 1287.47 1474.87 1674.29 1887.11 2600.14 2863.78 3139.17 3428.23 4043.41 4369.45 4709.33 5425.99 5802.67 6193.38 6595.39];
ydata = [1 0.9569 0.9528 0.8894 0.8387 0.8995 0.7911 0.773 0.7523 0.7155 0.7086 0.6478 0.6269 0.6175 0.574 0.551 0.4991 0.4559 0.4449 0.4314 0.4212 0.407 0.3856 0.3511 0.3526 0.303 0.3148 0.2912];
fun = @(D,xdata) D(1)*exp(-xdata.*D(2))+ D(3)*(exp(-xdata.*D(4)));
D = lsqcurvefit(fun, D_0, xdata, ydata);
semilogy(xdata, ydata, 'ko', xdata, fun(D,xdata), 'b-')
legend('Data', 'Fit')
format short
D_1 = D(1)
D_2 = D(2)
D_3 = D(3)
D_4 = D(4)
% I need to constrain the 2nd and 4th parameters to be below 0.001, even if
% it is not the most perfect fit.
% In this case D_2 is greater than 0.001
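For reference, lsqcurvefit accepts optional lower- and upper-bound vectors as its fifth and sixth arguments, so the bound can be stated directly. A minimal sketch with the model above (the upper bound of 1 on the amplitudes D(1) and D(3) is an assumed choice, adjust as needed):
lb = [0 0 0 0];            % all four parameters must stay positive
ub = [1 0.001 1 0.001];    % cap the decay rates D(2) and D(4) at 0.001; the amplitude cap of 1 is an assumption
D_bounded = lsqcurvefit(fun, D_0, xdata, ydata, lb, ub);
semilogy(xdata, ydata, 'ko', xdata, fun(D_bounded,xdata), 'r-')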

Accepted Answer

Alex Sha on 29 Apr 2022
Hi, the result is good enough:
Sum Squared Error (SSE): 0.0105245967805521
Root of Mean Square Error (RMSE): 0.01938758511131
Correlation Coef. (R): 0.99609231161877
R-Square: 0.992199893266025
Parameter Best Estimate
---------- -------------
d1 0.364389939850661
d2 0.00133809356253748
d3 0.610149012103863
d4 0.000110756764815313
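For completeness, a minimal sketch of how statistics of this kind can be computed from the lsqcurvefit result in the question (standard definitions; conventions for RMSE vary slightly, so the numbers may not match exactly):
yhat = fun(D, xdata);                          % model prediction at the data points
res  = ydata - yhat;                           % residuals
SSE  = sum(res.^2)                             % sum of squared errors
RMSE = sqrt(mean(res.^2))                      % root mean square error
R2   = 1 - SSE/sum((ydata - mean(ydata)).^2)   % coefficient of determination (R-Square)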
1 comment
Alfredo Scigliani on 29 Apr 2022
I know that the fit is good, but it is a false minimum. This fit tells me about properties of a material for which anything above 0.001 is simply nonsensical. There should be another solution to the data fit that the function is not reaching because it stops iterating at this one.


More Answers (1)

Mathieu NOE on 29 Apr 2022
Hello,
I admit my solution is the "poor man's" solution, as I don't have the Optimization Toolbox. But even with fminsearch I could do a 3-parameter fit that looks OK (fig 2, right) by forcing the last parameter (d) to be 0.001. If you let all 4 parameters free, d slightly overshoots the limit (0.0013), as displayed in fig 1.
Visually, both solutions seem OK.
clear; clc; clf; close all;
D_0 = [0.5 0.00001 0.5 0.00001]; %initial guess
xdata = [9.85 32.05 66.83 114.44 174.55 247.57 333.00 431.44 542.19 666.04 802.12 951.39 1113.38 1287.47 1474.87 1674.29 1887.11 2600.14 2863.78 3139.17 3428.23 4043.41 4369.45 4709.33 5425.99 5802.67 6193.38 6595.39];
ydata = [1 0.9569 0.9528 0.8894 0.8387 0.8995 0.7911 0.773 0.7523 0.7155 0.7086 0.6478 0.6269 0.6175 0.574 0.551 0.4991 0.4559 0.4449 0.4314 0.4212 0.407 0.3856 0.3511 0.3526 0.303 0.3148 0.2912];
%% 4 parameters fminsearch optimization
f = @(a,b,c,d,x) a.*exp(-b.*x) + c.*exp(-d.*x);
obj_fun = @(params) norm(f(params(1), params(2), params(3), params(4),xdata)-ydata);
sol = fminsearch(obj_fun, [0.5,1e-4,0.5,1e-4]);
a = sol(1);
b = sol(2)
c = sol(3);
d = sol(4)
xfit = linspace(min(xdata),max(xdata),100);
yfit = f(a, b, c, d, xfit);
figure(1);
semilogy(xdata, ydata, '+', 'MarkerSize', 10, 'LineWidth', 2)
hold on
semilogy(xfit, yfit, '-');
%% 3 parameters fminsearch optimization
d = 1e-3;
f = @(a,b,c,x) a.*exp(-b.*x) + c.*exp(-d.*x);
obj_fun = @(params) norm(f(params(1), params(2), params(3),xdata)-ydata);
sol = fminsearch(obj_fun, [0.5,1e-4,0.5]); % 3-element initial guess (a, b, c); d is fixed above
a = sol(1);
b = sol(2)
c = sol(3);
xfit = linspace(min(xdata),max(xdata),100);
yfit = f(a, b, c, xfit);
figure(2);
semilogy(xdata, ydata, '+', 'MarkerSize', 10, 'LineWidth', 2)
hold on
semilogy(xfit, yfit, '-');
3 comments
Alfredo Scigliani on 29 Apr 2022
Forcing it to be 0.001 is still a problem for me; the answer should be around 0.0001 or even smaller. What I am looking for is to constrain it to be below that value. Forcing it to a fixed value means assuming it is that value.
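For what it's worth, an upper bound can also be imposed without the Optimization Toolbox by letting fminsearch search over an unconstrained variable and squashing it into (0, 0.001) inside the model. A minimal sketch under that assumption (the logistic squashing function is just one possible choice), reusing xdata and ydata from above:
ub_d = 1e-3;                                     % required upper bound on the slow decay rate
squash = @(p) ub_d./(1 + exp(-p));               % maps any real p into the open interval (0, ub_d)
f = @(a,b,c,p,x) a.*exp(-b.*x) + c.*exp(-squash(p).*x);
obj_fun = @(params) norm(f(params(1), params(2), params(3), params(4), xdata) - ydata);
sol = fminsearch(obj_fun, [0.5, 1e-4, 0.5, 0]);  % p = 0 starts the bounded rate at ub_d/2
d = squash(sol(4))                               % fitted rate, guaranteed to stay below 1e-3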
Mathieu NOE on 9 May 2022
OK.
I tried to see how much the R² value evolves if I run a for loop over different d values. Here I test 1000 values of d, log-spaced from 10^-5 to 10^-2.
There are two optimal points, but they are in fact more or less the same solution, because the parameters can be flipped (b and d swap roles):
a = 3.914218228252134e-01 6.020081918678649e-01
b = 1.479928472725444e-03 1.073153563776005e-04
d = 1.079028791516184e-04 1.403289084785873e-03
Rsquared = 9.935951065959850e-01 9.938844449832044e-01
clear; clc; clf; close all;
xdata = [9.85 32.05 66.83 114.44 174.55 247.57 333.00 431.44 542.19 666.04 802.12 951.39 1113.38 1287.47 1474.87 1674.29 1887.11 2600.14 2863.78 3139.17 3428.23 4043.41 4369.45 4709.33 5425.99 5802.67 6193.38 6595.39];
ydata = [1 0.9569 0.9528 0.8894 0.8387 0.8995 0.7911 0.773 0.7523 0.7155 0.7086 0.6478 0.6269 0.6175 0.574 0.551 0.4991 0.4559 0.4449 0.4314 0.4212 0.407 0.3856 0.3511 0.3526 0.303 0.3148 0.2912];
%% 2 parameters fminsearch optimization
dd = logspace(-5,-2,1000);
for ci =1:numel(dd)
d = dd(ci);
f = @(a,b,x) a.*exp(-b.*x) + (1-a).*exp(-d.*x);
obj_fun = @(params) norm(f(params(1), params(2),xdata)-ydata);
sol = fminsearch(obj_fun, [0.5,1e-3]);
a(ci) = sol(1);
b(ci) = sol(2);
xfit = linspace(min(xdata),max(xdata),100);
yfit = f(a(ci), b(ci), xfit);
Rsquared(ci) = my_Rsquared_coeff(interp1(xdata,ydata,xfit),yfit); % correlation coefficient
end
% finding the best (or two best sets)
Rsquared_s = smoothdata(Rsquared,'gaussian',25);
Rsquared_s(Rsquared_s<=0.9) = 0.9;
ind = find(islocalmax(Rsquared_s));
figure(1);
semilogx(dd,Rsquared,dd(ind),Rsquared(ind),'dr');
title('R² vs d parameter ');
xlabel(' d parameter ');
ylabel(' R² ');
format long e
a = a(ind)
b = b(ind)
d = dd(ind)
Rsquared = Rsquared(ind)
yfit1 = a(1).*exp(-b(1).*xfit) + (1-a(1)).*exp(-d(1).*xfit);
yfit2 = a(2).*exp(-b(2).*xfit) + (1-a(2)).*exp(-d(2).*xfit);
figure(3);
semilogy(xdata, ydata, '+', 'MarkerSize', 10, 'LineWidth', 2)
hold on
semilogy(xfit, yfit1, '-', xfit, yfit2, '-');
title('Data fit ');
legend('data',['fit #1 , R² = ' num2str(Rsquared(1))],['fit #2 , R² = ' num2str(Rsquared(2))]);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function Rsquared = my_Rsquared_coeff(data,data_fit)
% R² (coefficient of determination) computation
% The total sum of squares
sum_of_squares = sum((data-mean(data)).^2);
% The sum of squares of residuals, also called the residual sum of squares:
sum_of_squares_of_residuals = sum((data-data_fit).^2);
% the coefficient of determination is then
Rsquared = 1 - sum_of_squares_of_residuals/sum_of_squares;
end
