I am using fmincon to solve an optimization problem where I only supply an analytical gradient for the objective. For the Hessian approximation I would like to try which option works best. However, I don't understand the diagnostics printed by fmincon, please see below.
I am using the following call to set up the options for fmincon, where I set BFGS as the Hessian approximation method:
options = optimoptions(@fmincon, 'SpecifyObjectiveGradient', true, ...
    'CheckGradients', false, 'MaxFunctionEvaluations', 1e4, ...
    'Display', 'iter-detailed', 'Algorithm', 'interior-point', ...
    'SpecifyConstraintGradient', true, 'FiniteDifferenceType', 'central', ...
    'MaxIterations', 5e2, 'HessianApproximation', 'bfgs', 'Diagnostics', 'on')
The options seem to have been accepted by optimoptions, as it prints:
Options used by current Algorithm ('interior-point'):
(Other available algorithms: 'active-set', 'sqp', 'sqp-legacy', 'trust-region-reflective')
Options not used by current Algorithm ('interior-point')
However, when running fmincon with the settings above, the diagnostics say that "finite-differencing (or Quasi-Newton)" is used for the Hessian. What exactly does that mean? Is it using BFGS or not? And why is Quasi-Newton only written in brackets?
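For context, this is how I am comparing the candidate Hessian approximations. This is just a sketch of my test loop, not my full setup; the documented values for 'HessianApproximation' with the interior-point algorithm are 'bfgs', 'lbfgs', a cell array like {'lbfgs', n} to set the memory length, and 'finite-difference':

```matlab
% Candidate Hessian approximations for the interior-point algorithm.
% With 1200 variables, the dense BFGS approximation may be slow, so
% the limited-memory variants are included for comparison.
hessOpts = {'bfgs', 'lbfgs', {'lbfgs', 10}, 'finite-difference'};
for k = 1:numel(hessOpts)
    opts = optimoptions(@fmincon, ...
        'Algorithm', 'interior-point', ...
        'SpecifyObjectiveGradient', true, ...
        'HessianApproximation', hessOpts{k}, ...
        'Display', 'final');
    % x0, lb, ub are placeholders for my actual problem data:
    % [x, fval] = fmincon(@(x) mhObjective(obj, x), x0, ...
    %     [], [], [], [], lb, ub, [], opts);
end
```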
The core of the question is actually that the performance (speed and accuracy) of fmincon is much worse than that of the Knitro trial version with the same configuration, and I'd like to figure out why.
Here is the relevant excerpt of the diagnostics output:
Number of variables: 1200
Objective and gradient: @(x)mhObjective(obj,x)
Hessian: finite-differencing (or Quasi-Newton)
Nonlinear constraints: do not exist
Number of linear inequality constraints: 0