Correlation regression lines between two parameters
9 views (last 30 days)
Amjad Iqbal
on 8 Nov 2021
Commented: Adam Danz on 9 Nov 2021
Hello dear researchers,
I have a query that needs your expertise to resolve.
I want to add a correlation regression line for two parameters for comparison purposes.
I have attached sub-plots, which are scatter plots between two orientations.
The goal now is to add a correlation regression line to each; one example plot I have added from a reference.
0 comments
Accepted Answer
yanqi liu
on 9 Nov 2021
Edited: yanqi liu on 9 Nov 2021
Sir, maybe upload some data; you can use lsqcurvefit, polyfit, and similar functions to compute the coefficients.
Please check the following code for an example:
% Generate noisy data from a straight line, then recover the slope
% and intercept with lsqcurvefit (requires the Optimization Toolbox).
clc; clear; close all;
xdata = linspace(0,3);
ydata = 1.3*xdata + 0.05*randn(size(xdata));
lb = [];                              % no lower bounds
ub = [];                              % no upper bounds
fun = @(x,xdata) x(1)*xdata + x(2);   % linear model y = a*x + b
x0 = [0,0];                           % initial guess for [a, b]
x = lsqcurvefit(fun,x0,xdata,ydata,lb,ub)
plot(xdata,ydata,'ko',xdata,fun(x,xdata),'r-','LineWidth',2)
legend('Data','Fitted line')          % the model here is linear, not exponential
title('Data and Fitted Curve')
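The same idea can be sketched in Python (an assumed stand-in, since the thread's code is MATLAB): scipy.optimize.curve_fit plays the role of lsqcurvefit, fitting the two coefficients of a straight-line model from an initial guess.

```python
# Python sketch of the MATLAB example above: fit y = a*x + b to noisy
# data with an iterative least-squares solver (curve_fit ~ lsqcurvefit).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
xdata = np.linspace(0, 3, 100)
ydata = 1.3 * xdata + 0.05 * rng.standard_normal(xdata.size)

def linear(x, a, b):
    """Straight-line model y = a*x + b."""
    return a * x + b

# p0 is the initial guess for [a, b], required by the iterative solver
# just as x0 is for lsqcurvefit.
coeffs, _ = curve_fit(linear, xdata, ydata, p0=[0.0, 0.0])
print(coeffs)  # slope near 1.3, intercept near 0
```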
10 comments
Adam Danz
on 9 Nov 2021
Yes, lsqcurvefit will provide the same results as polyfit or fitlm, but the latter two are designed for linear models and do not require making initial guesses for the parameter values. I'm not trying to convince anyone to change their approach (or their selected answer). I'm arguing that lsqcurvefit is not the best tool for linear regression.
polyfit is much more efficient than lsqcurvefit (95x faster):
x = rand(1,100);
y = 4.8*x + 2.1 + rand(1,100);
n = 5000;
fun = @(x,xdata) x(1)*xdata + x(2);   % linear model y = a*x + b
opts = optimoptions('lsqcurvefit','Display','off');
tic
for i = 1:n
    p = lsqcurvefit(fun,[0,0],x,y,[],[],opts);
end
T1 = toc % time in seconds
p
tic
for i = 1:n
    p2 = polyfit(x,y,1);
end
T2 = toc % time in seconds
p2 % same results
% Difference in time
T1/T2
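The same comparison can be illustrated in Python (a hedged stand-in for the MATLAB timing code above): numpy.polyfit solves the linear least-squares problem directly, while scipy's curve_fit iterates from an initial guess, so the direct solve is both simpler and faster while recovering essentially the same coefficients.

```python
# Python sketch of the comparison above: an iterative curve fitter
# vs. a direct linear least-squares solve on the same data.
import numpy as np
from scipy.optimize import curve_fit
import timeit

rng = np.random.default_rng(1)
x = rng.random(100)
y = 4.8 * x + 2.1 + rng.random(100)

def linear(xv, a, b):
    return a * xv + b

n = 500
t1 = timeit.timeit(lambda: curve_fit(linear, x, y, p0=[0.0, 0.0]), number=n)
t2 = timeit.timeit(lambda: np.polyfit(x, y, 1), number=n)

p_iter, _ = curve_fit(linear, x, y, p0=[0.0, 0.0])
p_direct = np.polyfit(x, y, 1)  # [slope, intercept]
print(t1 / t2)                  # the iterative fit is typically much slower
print(p_iter, p_direct)         # both recover essentially the same coefficients
```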
More Answers (1)
Adam Danz
on 8 Nov 2021
Edited: Adam Danz on 8 Nov 2021
Common methods of adding a simple linear regression line
Examples of #1 & #2
Example of #3
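As a minimal sketch of the general approach these methods share (in Python, an assumed stand-in for the MATLAB examples linked above): fit a least-squares line to the scattered data, then overlay it on the scatter plot.

```python
# Minimal sketch: fit a line with polyfit, then draw it over the scatter.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for scripted use
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
x = rng.random(50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)

slope, intercept = np.polyfit(x, y, 1)  # least-squares line
xs = np.array([x.min(), x.max()])       # two points suffice for a straight line

fig, ax = plt.subplots()
ax.scatter(x, y, label="data")
ax.plot(xs, slope * xs + intercept, "r-", label="regression line")
ax.legend()
fig.savefig("scatter_with_fit.png")
```

The same recipe applies per subplot: compute the fit from that panel's x/y data and plot it on that panel's axes.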
2 comments
Image Analyst
on 9 Nov 2021
Just to build on Adam's #3, this is how you'd do it:
% First get your model for fitted y values.
% Then quantify how well the predictions did, numerically, using several
% metrics such as RMSE and MAE.
% Fit a linear model between predicted and true values so we can get R squared.
mdl = fitlm(actualy, fittedy)
rSquared = mdl.Rsquared.Ordinary;
rmse = mdl.RMSE;
mdlMSE = mdl.MSE;
residuals = abs(actualy - fittedy);
% Find the mean and median absolute deviations of the residuals.
meandev = mad(residuals,0,'all');
mediandev = mad(residuals,1,'all');
aveResidual = mean(residuals, 'omitnan');
maxResidual = max(residuals);
fprintf('The RMSE value is %.5f.\n', rmse);
fprintf('The R^2 value is %.5f.\n', rSquared);
fprintf('The MSE value is %.5f.\n', mdlMSE);
fprintf('The fitted y differs from the actual y by an average of %.2f units.\n', aveResidual);
fprintf('The mean absolute deviation of the residuals is %.2f units.\n', meandev);
fprintf('The fitted y differs from the actual y by a median of %.2f units.\n', mediandev);
fprintf('The maximum residual is %.2f units.\n', maxResidual);
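For readers outside MATLAB, the same goodness-of-fit metrics can be sketched in Python (an assumed equivalent of the code above, using synthetic actual/fitted values since the thread's data is not available):

```python
# Python sketch of the metrics above: R^2, RMSE, MSE, and the
# mean/median/max absolute residuals between actual and fitted y.
import numpy as np

rng = np.random.default_rng(3)
actual_y = np.linspace(0, 10, 200)
fitted_y = actual_y + 0.5 * rng.standard_normal(200)  # hypothetical predictions

residuals = np.abs(actual_y - fitted_y)
mse = np.mean((actual_y - fitted_y) ** 2)
rmse = np.sqrt(mse)
# R^2 of the simple regression of fitted on actual values equals the
# squared correlation coefficient in the one-predictor case.
r = np.corrcoef(actual_y, fitted_y)[0, 1]
r_squared = r ** 2
mean_abs = residuals.mean()
median_abs = np.median(residuals)
max_abs = residuals.max()

print(f"RMSE = {rmse:.5f}, MSE = {mse:.5f}, R^2 = {r_squared:.5f}")
print(f"mean |residual| = {mean_abs:.2f}, median = {median_abs:.2f}, max = {max_abs:.2f}")
```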