Solve Nonlinear System Without and Including Jacobian
This example shows the reduction in function evaluations when you provide derivatives for a system of nonlinear equations. As explained in Writing Vector and Matrix Objective Functions, the Jacobian $J(x)$ of a system of equations $F(x)$ is $J_{ij}(x) = \partial F_i(x)/\partial x_j$. Provide this derivative as the second output of your objective function.
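For instance, a minimal sketch of this pattern (not part of the original example; the name smallsystem and its two equations are made up for illustration) looks like this:

function [F,J] = smallsystem(x)
% Hypothetical two-equation system: x1^2 + x2 - 3 = 0 and x1 - x2 = 0
F = [x(1)^2 + x(2) - 3;
     x(1) - x(2)];
if nargout > 1               % return the Jacobian only when the solver requests it
    J = [2*x(1), 1;          % row i holds the partial derivatives of F(i)
         1,     -1];
end
end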
For example, the multirosenbrock function is an $n$-dimensional generalization of Rosenbrock's function (see Solve a Constrained Nonlinear Problem, Problem-Based) for any positive, even value of $n$:

$$\begin{aligned}
F(1) &= 1 - x_1\\
F(2) &= 10\left(x_2 - x_1^2\right)\\
F(3) &= 1 - x_3\\
F(4) &= 10\left(x_4 - x_3^2\right)\\
&\;\;\vdots\\
F(n-1) &= 1 - x_{n-1}\\
F(n) &= 10\left(x_n - x_{n-1}^2\right).
\end{aligned}$$

The solution of the equation system $F(x) = 0$ is the point $x_i = 1$, $i = 1,\ldots,n$.

For this objective function, all the Jacobian terms $J_{ij}$ are zero except the terms where $i$ and $j$ differ by at most one. For odd values of $i$, the nonzero terms are

$$\begin{aligned}
J_{i,i} &= -1\\
J_{i+1,i} &= -20x_i\\
J_{i+1,i+1} &= 10.
\end{aligned}$$
The multirosenbrock helper function at the end of this example creates the objective function $F(x)$ and its Jacobian $J(x)$.
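As an informal check of these formulas (not part of the original example; the problem size ncheck = 6 and step h = 1e-6 are arbitrary choices), you can compare the analytic Jacobian returned by the helper against a central finite-difference approximation at a random point:

% Informal finite-difference check of the multirosenbrock Jacobian
ncheck = 6;                               % any small, even problem size
xcheck = randn(ncheck,1);
[~,Janalytic] = multirosenbrock(xcheck);  % analytic Jacobian from the helper
h = 1e-6;                                 % finite-difference step
Jfd = zeros(ncheck);
for j = 1:ncheck
    ej = zeros(ncheck,1);
    ej(j) = h;
    Jfd(:,j) = (multirosenbrock(xcheck+ej) - multirosenbrock(xcheck-ej))/(2*h);
end
disp(norm(full(Janalytic) - Jfd))         % should be close to zero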
Solve the system of equations starting from the point x0(i) = -1.9 for odd values of i, and x0(i) = 2 for even values of i. Specify n = 64.
n = 64;
x0(1:n,1) = -1.9;
x0(2:2:n,1) = 2;
[x,F,exitflag,output,JAC] = fsolve(@multirosenbrock,x0);
Equation solved. fsolve completed because the vector of function values is near zero as measured by the value of the function tolerance, and the problem appears regular as measured by the gradient.
Examine the distance of the computed solution x from the true solution, and the number of function evaluations that fsolve takes to compute the solution.
disp(norm(x-ones(size(x))))
0
disp(output.funcCount)
1043
fsolve finds the solution, but takes over 1000 function evaluations to do so. Without a user-supplied Jacobian, the solver estimates the derivatives by finite differences, which costs roughly n additional function evaluations each time the Jacobian is estimated.
Solve the system of equations again, this time using the Jacobian. To do so, set the 'SpecifyObjectiveGradient' option to true.
opts = optimoptions('fsolve','SpecifyObjectiveGradient',true);
[x2,F2,exitflag2,output2,JAC2] = fsolve(@multirosenbrock,x0,opts);
Equation solved. fsolve completed because the vector of function values is near zero as measured by the value of the function tolerance, and the problem appears regular as measured by the gradient.
Again, examine the distance of the computed solution x2 from the true solution, and the number of function evaluations that fsolve takes to compute the solution.
disp(norm(x2-ones(size(x2))))
0
disp(output2.funcCount)
21
fsolve returns the same solution as before, but takes about 20 function evaluations to do so, rather than over 1000. In general, using the Jacobian can lower the number of function evaluations and provide increased robustness, although this example does not show improved robustness.
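If you want fsolve itself to compare the supplied Jacobian against a finite-difference estimate before relying on it, one possibility (illustrative, not part of the original example) is to also enable the 'CheckGradients' option:

% Illustrative: ask fsolve to check the user-supplied Jacobian against
% finite differences at the initial point before solving
optsCheck = optimoptions('fsolve', ...
    'SpecifyObjectiveGradient',true, ...
    'CheckGradients',true);
xchecked = fsolve(@multirosenbrock,x0,optsCheck);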
Helper Function
This code creates the multirosenbrock helper function.
function [F,J] = multirosenbrock(x)
% Get the problem size
n = length(x);
if n == 0, error('Input vector, x, is empty.'); end
if mod(n,2) ~= 0
   error('Input vector, x, must have an even number of components.');
end

% Evaluate the vector function
odds  = 1:2:n;
evens = 2:2:n;
F = zeros(n,1);
F(odds,1)  = 1-x(odds);
F(evens,1) = 10.*(x(evens)-x(odds).^2);

% Evaluate the Jacobian matrix if nargout > 1
if nargout > 1
   c = -ones(n/2,1);    C = sparse(odds,odds,c,n,n);
   d = 10*ones(n/2,1);  D = sparse(evens,evens,d,n,n);
   e = -20.*x(odds);    E = sparse(evens,odds,e,n,n);
   J = C + D + E;
end
end
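As a small usage illustration (not part of the original example), evaluating the helper at the known solution returns a zero residual, and the Jacobian shows the banded structure described above:

% Illustrative check: the residual vanishes at the known solution x = ones(n,1)
[Fsol,Jsol] = multirosenbrock(ones(6,1));
disp(norm(Fsol))     % displays 0
disp(full(Jsol))     % nonzeros only on the diagonal and at (i+1,i) for odd i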