How can I draw a shaded confidence region with nlparci and lsqcurvefit?
Accepted Answer
Hi @Khadija,
Please see my responses to your queries below.
Addressing your query regarding “I'm just trying to understand”:
It sounds like you want to assess the precision of your model and make informed decisions based on the uncertainty present in the data. After looking into your two calls, I realized that they need to work together: the first call, lsqcurvefit(@simulatedhs,k0,tforward,[Hdata,HSdata],lb,ub), performs the curve fit, optimizing the parameters k to minimize the residual sum of squares; the second call, nlparci(k,Rsd,'jacobian',Jmat), then uses the residuals Rsd and the Jacobian matrix Jmat returned by lsqcurvefit to calculate confidence intervals for the estimated parameters. Together they not only fit the data but also quantify the uncertainty of the estimated parameters, and the shaded region provides a clear visual indication of the precision and reliability of the fit.
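As a side note, nlparci returns 95% intervals by default; if I remember its interface correctly, the 'alpha' name-value argument changes the confidence level. A minimal sketch, reusing the k, Rsd, and Jmat outputs of lsqcurvefit:
% Sketch: confidence intervals at two levels from the same fit.
CI95 = nlparci(k, Rsd, 'jacobian', Jmat); % 95% intervals (default)
CI90 = nlparci(k, Rsd, 'jacobian', Jmat, 'alpha', 0.10); % 90% intervals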
Now, addressing your query regarding “I point out that I am a beginner in Matlab programming, especially the part of parameter estimation and associated topics”:
I agree with you: correctly interpreting and representing the confidence intervals in the plot requires attention to detail and a good understanding of the statistical concepts involved.
Going back to “It is a challenge, and I will stick to it”:
Drawing a shaded confidence region with nlparci and lsqcurvefit in MATLAB can indeed be a challenging task, especially when dealing with nonlinear curve fitting.
Now, addressing “How can I draw a shaded confidence region with nlparci and lsqcurvefit?”:
The overall flow is:
1. Define the model function simulatedhs, which takes the parameters k and the time vector t and returns y = k(1)*exp(-k(2)*t) + k(3).
2. Define the initial guess k0 for the parameters [k1, k2, k3], and the lower and upper parameter bounds lb and ub.
3. Generate sample data: tforward is a vector of time points, and Hdata and HSdata are generated with simulatedhs plus added noise.
4. Set the optimization options with optimoptions, such as the algorithm, display level, and function tolerance.
5. Perform the nonlinear curve fit with lsqcurvefit, passing the model, the initial guess k0, the sample data, and the bounds lb and ub; it returns the optimized parameters k along with the residuals Rsd and the Jacobian Jmat.
6. Calculate the confidence intervals CI with nlparci, using the residuals and the Jacobian matrix Jmat.
7. Plot the results: the fitted curve is drawn in blue and the confidence region is shaded in cyan, using the optimized parameters and the calculated bounds.
Here is the updated code (note that in a MATLAB script, the local function definition must appear at the end of the file):
% Improved initial guess for parameters
k0 = [1, 0.1, 0.5];
% Adjusted bounds for parameters
lb = [0, 0, 0];
ub = [10, 10, 10];
% Generate sample data for t and Hdata, HSdata
tforward = linspace(0, 10, 100); % Example time points
Hdata = simulatedhs([2, 0.5, 1], tforward) + 0.1*randn(size(tforward)); % Example data
HSdata = simulatedhs([2, 0.5, 1], tforward) + 0.1*randn(size(tforward)); % Example data
options = optimoptions('lsqcurvefit', 'Algorithm', 'trust-region-reflective', 'Display', 'iter', 'FunctionTolerance', 1e-12);
[k, Rsdnrm, Rsd, ExFlg, OptmInfo, Lmda, Jmat] = lsqcurvefit(@simulatedhs, k0, tforward, Hdata, lb, ub, options);
CI = nlparci(k, Rsd, 'jacobian', Jmat);
disp('CI');
disp(CI);
% Assuming CI has been calculated correctly
lower_bound = CI(:, 1);
upper_bound = CI(:, 2);
disp('Lower Bound:');
disp(lower_bound);
disp('Upper Bound:');
disp(upper_bound);
% Generate data points for plotting
x = linspace(min(tforward), max(tforward), 100);
y_fit = simulatedhs(k, x); %Fitted curve
% Calculate upper and lower bounds for the confidence interval
y_lower = simulatedhs(CI(:, 1), x);
y_upper = simulatedhs(CI(:, 2), x);
% Plot the fitted curve and shaded confidence region
figure;
plot(x, y_fit, 'b', 'LineWidth', 2); % Fitted curve
hold on;
fill([x, fliplr(x)], [y_lower, fliplr(y_upper)], 'c', 'FaceAlpha', 0.3); % Shaded confidence region
xlabel('X-axis');
ylabel('Y-axis');
title('Fitted Curve with Confidence Region');
legend('Fitted Curve', 'Confidence Region');
grid on;
% Define the equation for nonlinear curve fitting
% (local functions must appear at the end of a script file)
function y = simulatedhs(k, t)
y = k(1) * exp(-k(2) * t) + k(3);
end
Please see attached results and plots.


Optimization Process Tips
When running lsqcurvefit, you may need to tune a few things during the optimization process:
- Improve the initial guess k0. The initial guess plays a crucial role in the optimization: it should be close to the expected optimal values to help the algorithm converge faster. Adjust it based on prior knowledge or by analyzing the data.
- Adjust the parameter bounds lb and ub. The bounds restrict the search space; if they are too narrow, the algorithm may not be able to explore enough of the parameter space to find the minimum. Widening them allows more flexibility in the parameter estimation.
- Change the optimization algorithm. lsqcurvefit lets you specify the algorithm to use; the code above uses 'trust-region-reflective', and 'levenberg-marquardt' is an alternative worth trying (see the sketch after this list).
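For illustration, here is a minimal sketch of refitting with 'levenberg-marquardt'; it assumes simulatedhs, k0, tforward, and Hdata from the script above, and optsLM/kLM are just placeholder names:
% Sketch: refit with the Levenberg-Marquardt algorithm.
% Note: 'levenberg-marquardt' does not accept bound constraints,
% so lb and ub are passed as empty arrays.
optsLM = optimoptions('lsqcurvefit', 'Algorithm', 'levenberg-marquardt', 'Display', 'final');
kLM = lsqcurvefit(@simulatedhs, k0, tforward, Hdata, [], [], optsLM);
disp(kLM);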
Interpreting "Optimization completed because the size of the gradient is less than the value of the optimality tolerance"
Local minimum found.
Optimization completed because the size of the gradient is less than the value of the optimality tolerance.
Optimization completed: The first-order optimality measure, 1.684030e-07, is less than options.OptimalityTolerance = 1.000000e-06.
When you encounter the message "Optimization completed because the size of the gradient is less than the value of the optimality tolerance," it means the algorithm terminated successfully under the specified optimality tolerance: the gradient of the objective function has become small enough that the solver considers itself converged. Further iterations are unlikely to improve the solution significantly, so the optimization process is considered complete.
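If you want to check this programmatically, here is a small sketch using the ExFlg and OptmInfo outputs of the lsqcurvefit call above (for lsqcurvefit, an exit flag of 1 corresponds to this gradient-based stopping test):
% Sketch: inspect the convergence diagnostics returned by lsqcurvefit.
fprintf('Exit flag: %d\n', ExFlg); % 1 => converged on the gradient test
fprintf('First-order optimality: %g\n', OptmInfo.firstorderopt);
fprintf('Iterations: %d\n', OptmInfo.iterations);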
For more details, please click the links below.
https://www.mathworks.com/help/optim/msg_csh/local-minimum-found-1.html
https://www.mathworks.com/help/optim/ug/when-the-solver-succeeds.html
Please feel free to customize the code and parameters based on your preferences. If you have any further questions, please let me know.