fmincon solution does not differ from initial guess if I provide gradient

Hello everyone,
I am currently facing a problem when providing the analytic gradient of the objective function to be minimised.
I need to minimise a functional depending on a state variable vector of length 3*N+4, with N on the order of 10. I also need some linear equality constraints to hold, which I impose by providing a suitable matrix Aeq and vector beq.
I noticed that the solution is not satisfactory, so I decided to provide the analytic gradient of the functional. To do so, I define my objective function as
function [f,g]=objective_function(x,parameters)
where f is the scalar objective value and g is the gradient vector of size 3*N+4. In my main script, so that the objective function depends only on x, I define the function handle
function_to_minimise=@(x) objective_function(x,given_parameters)
where given_parameters is defined in the main script.
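For illustration, a minimal sketch of such a two-output objective function; the quadratic form and the fields of parameters are placeholders, not the actual functional from the question:
function [f,g] = objective_function(x,parameters)
% Hypothetical quadratic objective: f(x) = 0.5*x'*H*x + c'*x
H = parameters.H;            % assumed (3*N+4)-by-(3*N+4) symmetric matrix
c = parameters.c;            % assumed (3*N+4)-by-1 vector
f = 0.5*(x'*H*x) + c'*x;     % scalar objective value
if nargout > 1               % compute the gradient only when requested
    g = H*x + c;             % analytic gradient, size 3*N+4
end
end
The nargout guard is the usual idiom so the gradient is only computed on calls where fmincon actually requests it.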
Of course, I also set the following option:
options=optimoptions('fmincon','GradObj','on');
If I don't provide the gradient, I obtain a solution that is not satisfactory, meaning that it is not consistent with the expected results and the theory.
I noticed that if I provide the gradient, the fmincon routine stops after 1 or 2 iterations and the result is the same as the initial guess. To check whether the analytic gradient is correct, I compared it against the numerical gradient of fmincon for some well-known cases, and the results agree, so I concluded that the analytic gradient I provide is correct.
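As a quick sanity check of this kind, one can also compare the analytic gradient against a central finite-difference approximation at a random point; a sketch, assuming the two-output objective_function above:
x0 = randn(3*N+4,1);                          % random test point
[~,g_analytic] = objective_function(x0,given_parameters);
h = 1e-6;                                     % finite-difference step
g_numeric = zeros(size(x0));
for k = 1:numel(x0)
    e = zeros(size(x0)); e(k) = h;
    g_numeric(k) = (objective_function(x0+e,given_parameters) ...
                  - objective_function(x0-e,given_parameters)) / (2*h);
end
max(abs(g_analytic - g_numeric))              % should be tiny, e.g. < 1e-6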
Notes:
  • Since I only have linear equality constraints, I do not have to provide the gradients of the constraints.
  • I tried to optimise the functional with all of the available fmincon algorithms.
Is there something that I am missing which might cause this problem?
P.S. I hope I gave enough details and that my explanation is clear enough.

Accepted Answer

Bruno Luong on 9 Apr 2024
To get more information, set the option 'CheckGradients' to true, and check the exitflag (third output) of fmincon.
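A sketch of what that looks like; x0, function_to_minimise, Aeq and beq are the quantities from the question:
options = optimoptions('fmincon', ...
    'SpecifyObjectiveGradient',true, ...      % use the user-supplied gradient
    'CheckGradients',true);                   % compare it to finite differences
[x,fval,exitflag,output] = fmincon(function_to_minimise,x0, ...
    [],[],Aeq,beq,[],[],[],options);
disp(exitflag)        % e.g. 1: first-order optimality met,
                      %      2: change in x below StepTolerance
disp(output.message)  % human-readable reason why the solver stopped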
3 comments
Bruno Luong on 10 Apr 2024 (edited 10 Apr 2024)
"Is there a way to solve this problem with fmincon?"
No. fmincon is a solver that assumes a C1 objective function and constraints; you cannot relax that requirement.
However, if you have a term whose derivative jumps, you might tweak it to make it C1. For example, replace abs(x) by
x.^2 ./ sqrt(x.^2 + epsilon)
to make the function smooth around x = 0, or replace a step function (if/else) by a logistic function ("soft" logical).
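A sketch of both smoothing tricks; epsilon and k are tuning constants to be chosen for the problem:
epsilon = 1e-6;       % smoothing parameter (assumed value)
smooth_abs  = @(x) x.^2 ./ sqrt(x.^2 + epsilon);   % C1 stand-in for abs(x)
k = 100;              % logistic steepness (assumed value)
smooth_step = @(x) 1 ./ (1 + exp(-k*x));           % C1 stand-in for x > 0
Smaller epsilon and larger k reproduce the original nonsmooth terms more closely, at the cost of steeper gradients for the solver to handle.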
Davide Manfredo on 10 Apr 2024
I see, I guess I will do that... Thank you again for your advice!


More Answers (1)

Torsten on 9 Apr 2024 (moved 9 Apr 2024)
Use 'SpecifyObjectiveGradient' instead of 'GradObj','on' if you use "optimoptions" and not "optimset". From the documentation:
SpecifyObjectiveGradient — Gradient for the objective function defined by the user. See the description of fun to see how to define the gradient in fun. The default, false, causes fmincon to estimate gradients using finite differences. Set to true to have fmincon use a user-defined gradient of the objective function. To use the 'trust-region-reflective' algorithm, you must provide the gradient and set SpecifyObjectiveGradient to true.
For optimset, the name is GradObj and the values are 'on' or 'off'. See Current and Legacy Option Names.
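That is, a sketch of the two equivalent ways to enable the user-supplied gradient:
% Current option name, with optimoptions:
options = optimoptions('fmincon','SpecifyObjectiveGradient',true);
% Legacy option name, with optimset:
options = optimset('GradObj','on');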
