General Least Squares Fit
I have a set of experimental data and "model data" that represent an ideal model for the experiment. The experimental data and "model data" share the same x values but obviously have different y values.
The model data cannot be represented by any well-behaved closed-form function or polynomial.
Is there a function in MATLAB I can use to do a non-linear least squares fit of the data to the model without resorting to writing least squares code?
Answers (4)
Richard Willey
13 Sep 2011
I am somewhat confused by the question.
Regression analysis is used to estimate a set of regression coefficients that minimizes the difference between predicted and actual values. In turn, this requires that you have some kind of equation that describes the relationship between your variables. If you don't have an equation, what (precisely) are you estimating?
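For context, here is a minimal sketch of what a parametric nonlinear fit looks like when you do have a model equation. The exponential model, the variable names, and the use of the Statistics Toolbox nlinfit function are illustrative assumptions, not something from the original question:
% Illustrative only: parametric nonlinear least squares needs a model equation.
% Here the assumed model is y = b1*exp(-b2*x); x and y are your data vectors.
modelFun = @(b, x) b(1) * exp(-b(2) * x);
beta0    = [1; 0.5];                               % starting guess for the coefficients
betaHat  = nlinfit(x(:), y(:), modelFun, beta0);   % requires Statistics Toolbox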
Hypothetically, I can think of some interesting approaches you might take to address your problem.
For example, you could do the following:
1. Bootstrap the dataset for your model.
2. Identify the LOESS spanning parameter that minimizes the difference between your localized regression model and the true curve.
3. Repeat this some arbitrarily large number of times and take the average of the spanning parameters.
4. Use this derived value as the optimal spanning parameter for a localized regression model that is fit to your experimental data.
I have a function called fitit on the File Exchange that you could modify to do this without too much trouble. (It uses cross validation to estimate an optimal smoothing parameter for a localized regression model and then uses a bootstrap to estimate confidence intervals.)
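A rough MATLAB sketch of steps 1-4, assuming the Curve Fitting Toolbox smooth function is available. The variable names (x, yModel, yExperimental, spans, nBoot) and the candidate span grid are placeholders for illustration, not part of fitit:
% Bootstrap selection of a LOESS span, following steps 1-4 above.
spans    = 0.05:0.05:0.5;            % candidate spanning parameters (illustrative grid)
nBoot    = 200;                      % number of bootstrap replicates
bestSpan = zeros(nBoot, 1);

for b = 1:nBoot
    idx = randi(numel(x), numel(x), 1);     % step 1: resample the model data with replacement
    [xb, order] = sort(x(idx));
    yb = yModel(idx);
    yb = yb(order);

    sse = zeros(size(spans));
    for k = 1:numel(spans)
        ySmooth = smooth(xb, yb, spans(k), 'loess');   % localized regression fit
        sse(k)  = sum((ySmooth - yb).^2);              % step 2: distance from the "true" curve
    end
    [~, kBest]  = min(sse);
    bestSpan(b) = spans(kBest);
end

optimalSpan = mean(bestSpan);                                  % step 3: average over replicates
yExpSmooth  = smooth(x, yExperimental, optimalSpan, 'loess');  % step 4: fit the experimental data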
Richard Willey
14 Sep 2011
Thanks for clarifying.
From the sounds of things, you need some kind of solution for non-parametric fitting.
The choice of algorithm will (largely) depend on the dimensionality of your data. If you're working with a low-dimensional data set (1-2 independent variables), you'll probably want to use a localized regression model or perhaps a smoothing spline.
If you're working with a high-dimensional data set, then your best option is either some kind of decision tree (Statistics Toolbox supports boosted and bagged decision trees) or, alternatively, a neural network.
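A minimal sketch of both options, assuming the Curve Fitting Toolbox and Statistics Toolbox; the variable names x, y, and X are placeholders for your data:
% Low-dimensional case: smoothing spline via the Curve Fitting Toolbox.
splineFit = fit(x(:), y(:), 'smoothingspline');    % smoothing parameter chosen automatically
ySpline   = feval(splineFit, x(:));                % evaluate the spline at the original x

% Higher-dimensional case: bagged regression trees via the Statistics Toolbox.
% X is an n-by-p matrix of predictors, y an n-by-1 response.
treeEnsemble = TreeBagger(100, X, y, 'Method', 'regression');
yTrees       = predict(treeEnsemble, X);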