Analytical Hessian for large-scale optimization in fmincon

Hi everyone,
I have set up an optimization problem in MATLAB which stems from an inverse problem in design. I tested my setup with small test cases (decision space of ~80 variables) and fmincon works like a charm. I did some calculations to get rough estimates on my actual optimization problem, and here is what it looks like:
Decision space: ~20,000 variables
Nonlinear equality constraints: ~15,000
Linear inequality constraints: ~5,000
Clearly, it's a large-scale optimization problem and, per MATLAB's recommendations, I should use the interior-point method. I would also like to note that I have access to the gradient of the cost function and possibly the Hessian (it is likely to be ridiculously large, and I haven't coded it up yet). Here are a few questions that I would very much appreciate responses to:
  1. Can fmincon efficiently handle a problem such as mine, or am I better off using a dedicated commercial solver?
  2. I think I would absolutely need to use L-BFGS for my Hessian approximation (when I don't pass an analytical Hessian). Because L-BFGS approximates the Hessian using limited memory, is passing an analytical Hessian worth it? I ask because I am under the impression that if I pass an analytical Hessian, I will not be able to use the limited-memory approximation, and the interior-point algorithm will use the full Hessian at every iteration to converge to a solution.
  3. Does fmincon work if I use the sparse function to pass an analytical Hessian? (I am fairly certain that my Hessian is going to be sparse.)
Thank you for your time. Please feel free to ask for any clarification.
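For reference, here is a minimal sketch of how the pieces in questions 2 and 3 fit together for the interior-point algorithm. The 3-variable quadratic objective, the single constraint, and all names and sizes are illustrative stand-ins, not from the actual problem:

```matlab
% Toy sketch: analytic sparse Hessian with fmincon's interior-point method.
% All sizes, the objective, and the constraint are illustrative only.
n = 3;
H = sparse(1:n, 1:n, [2 4 6]);                  % sparse objective Hessian

fun   = @(x) deal(0.5*(x'*H*x), H*x);           % objective value and gradient
nlcon = @(x) deal([], sum(x.^2) - 1, [], 2*x);  % one equality: sum(x.^2) = 1

% HessianFcn must return the Hessian of the LAGRANGIAN, not just of f;
% here the constraint's Hessian is 2*I, weighted by its multiplier.
hessfcn = @(x, lambda) H + lambda.eqnonlin(1) * (2*speye(n));

opts = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'SpecifyObjectiveGradient',true, ...
    'SpecifyConstraintGradient',true, ...
    'HessianFcn',hessfcn);
% Alternative without an analytic Hessian: limited-memory BFGS, e.g.
% opts = optimoptions(opts, 'HessianFcn',[], 'HessianApproximation',{'lbfgs',10});

x0 = ones(n,1)/sqrt(n);
[x, fval] = fmincon(fun, x0, [],[],[],[],[],[], nlcon, opts);
```

Note that for interior-point, `HessianFcn` takes `(x, lambda)` with multipliers in `lambda.eqnonlin` and `lambda.ineqnonlin`, and a sparse return value is accepted.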

Accepted Answer

Matt J on 24 Apr 2021
Can fmincon efficiently handle a problem such as mine, or am I better off using a dedicated commercial solver?
The only way to find out is to try...
Does fmincon work if I use the sparse function to pass an analytical hessian? (I am fairly certain that my Hessian is going to be sparse)
Yes. You can also use a Hessian Multiply Function to conserve memory.
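A hedged sketch of that Hessian-multiply route, using the same illustrative toy problem (names and sizes are not from the actual problem): instead of returning the Lagrangian Hessian as a matrix, you supply only its action on a vector and switch the subproblem algorithm to conjugate gradients.

```matlab
% Toy sketch: HessianMultiplyFcn with fmincon's interior-point method.
% The Hessian of the Lagrangian is never formed; only products with v are.
n = 3;
H = sparse(1:n, 1:n, [2 4 6]);                  % sparse objective Hessian

fun   = @(x) deal(0.5*(x'*H*x), H*x);           % objective value and gradient
nlcon = @(x) deal([], sum(x.^2) - 1, [], 2*x);  % one equality: sum(x.^2) = 1

% W = (Hessian of Lagrangian at x, lambda) * v, applied matrix-free
hmf = @(x, lambda, v) H*v + lambda.eqnonlin(1)*(2*v);

opts = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'SpecifyObjectiveGradient',true, ...
    'SpecifyConstraintGradient',true, ...
    'HessianMultiplyFcn',hmf, ...
    'SubproblemAlgorithm','cg');   % required when using a multiply function

x = fmincon(fun, ones(n,1), [],[],[],[],[],[], nlcon, opts);
```

This trades some per-iteration work (CG subproblem solves) for never storing the full Hessian, which can matter at ~20,000 variables.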
  5 comments
Matt J on 28 Apr 2021
As I recall, the general theory of BFGS is that, for quadratic functions, it converges in at most N iterations, where N is the number of unknowns. With an exact Hessian, of course, convergence occurs within 1 iteration.
Chaitanya Awasthi on 28 Apr 2021
Yes, I believe that is true.


More Answers (0)

Release

R2019b
