How to set up an optimization problem to minimize the sum of squared residuals using the Genetic Algorithm?
Victor Assis
on 3 May 2014
Edited: Victor Assis on 4 May 2014
Hello, my name is Victor Assis and I am a student from Brazil. I have been working hard on a problem that I could not quite solve myself, and I need your help. This is the deal:
I have to fit an equation to a data set, but my equation has both linear and nonlinear parameters. My idea is to use ordinary least squares (OLS) with a slight difference: I am going to anchor the values of the linear parameters to the values of the nonlinear ones. To do this, I set up the classical OLS problem and computed the partial derivatives with respect to the linear parameters. Once I had those, I used them as a constraint (I substituted them into the sum of squared residuals).
The problem I am facing is that I don't understand how to set this problem up in MATLAB. When I try to write an objective function, I don't know how to specify which inputs are the parameters and which are the dataset I am going to use.
I don't know if I made myself clear, but I will write out the equation I am trying to fit to my dataset, just in case:
y= A + B*(tc-t)^(z)+C*(tc-t)^(z)*cos(w*log(tc-t)+phi)
Linear parameters that will be anchored: A, B, C
Parameters estimated by the genetic algorithm: tc, z, w, phi
Dataset used: y, t (both are N×1 column vectors)
I appreciate your help. Thank you very much.
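The "anchoring" idea described above can be made concrete: for fixed nonlinear parameters tc, z, w, and phi, the model is linear in A, B, and C, so those can be recovered by OLS with MATLAB's backslash operator. A minimal sketch of this step, assuming t and y are the data vectors and the function name `linfit` is illustrative (save as linfit.m):

```matlab
% Given the nonlinear parameters p = [tc z w phi], solve for the
% linear parameters [A; B; C] by ordinary least squares.
% t, y are the Nx1 data vectors; tc must exceed max(t) so that
% (tc-t) is positive and log(tc-t) is real.
function [lin, ssr] = linfit(p, t, y)
    tc = p(1); z = p(2); w = p(3); phi = p(4);
    X = [ones(size(t)), (tc-t).^z, (tc-t).^z .* cos(w*log(tc-t)+phi)];
    lin = X \ y;        % OLS estimates [A; B; C]
    r   = y - X*lin;    % residuals at the OLS solution
    ssr = r.' * r;      % sum of squared residuals
end
```

The genetic algorithm then only has to search over the four nonlinear parameters, minimizing the `ssr` returned here.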
0 comments
Accepted Answer
Star Strider
on 3 May 2014
To parameterise your function to be used in MATLAB regression functions, you need to write your function as:
% b(1) = tc, b(2) = z, b(3) = w, b(4) = phi
yfit = @(b,t,A,B,C) A + B.*(b(1)-t).^b(2) + (C.*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
or, if you put A, B, and C inside the function rather than calling them as arguments, the function becomes:
yfit = @(b,t) A + B.*(b(1)-t).^b(2) + (C.*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
The nonlinear regression functions (nlinfit, lsqcurvefit) will compute the Jacobian for you. If you use a genetic algorithm, you will not need the Jacobian, because it does not use one.
See the documentation for the various functions for details on using them. You may have to experiment with different choices of initial parameter estimates in order for your regression to converge.
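With the parameterized function above, a genetic-algorithm fit minimizes the sum of squared residuals as a scalar function of b. A sketch, assuming t, y, A, B, and C are already in the workspace, and with purely illustrative bounds (the ga function requires the Global Optimization Toolbox):

```matlab
% Scalar objective for ga: sum of squared residuals as a function of b.
% b = [tc z w phi]; t, y are the data; A, B, C are fixed here.
yfit = @(b,t) A + B.*(b(1)-t).^b(2) + (C.*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
SS   = @(b) sum((y - yfit(b,t)).^2);

nvars = 4;                                   % four nonlinear parameters
lb = [max(t)+eps, 0.1,  1,  0   ];           % illustrative bounds: tc > max(t)
ub = [max(t)+10,  1.0, 20,  2*pi];
b  = ga(SS, nvars, [], [], [], [], lb, ub);  % estimated [tc z w phi]
```

The bounds here are guesses for illustration only; choose them from what you know about your data.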
8 comments
Star Strider
on 4 May 2014
Edited: Star Strider on 4 May 2014
Good point!
% b(1) = tc, b(2) = z, b(3) = w, b(4) = phi, b(5) = A, b(6) = B, b(7) = C
yfit = @(b,t) b(5) + b(6).*(b(1)-t).^b(2) + (b(7).*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
That should work. The SS function doesn’t change.
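For completeness, the unchanged sum-of-squares objective for this seven-parameter version, with a ga call whose bounds are again only illustrative, would look like:

```matlab
% b = [tc z w phi A B C]: all seven parameters estimated together.
yfit = @(b,t) b(5) + b(6).*(b(1)-t).^b(2) + (b(7).*(b(1)-t).^b(2)).*cos(b(3).*log(b(1)-t)+b(4));
SS   = @(b) sum((y - yfit(b,t)).^2);         % scalar SSR objective

lb = [max(t)+eps, 0.1,  1,  0,    -10, -10, -10];  % illustrative bounds
ub = [max(t)+10,  1.0, 20,  2*pi,  10,  10,  10];
b  = ga(SS, 7, [], [], [], [], lb, ub);      % estimated [tc z w phi A B C]
```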
My pleasure! I’m glad I could help.
More Answers (1)
Victor Assis
on 4 May 2014
2 comments
Star Strider
on 4 May 2014
My pleasure!
Considering that you appear to be an Economist, I just might!
