Minimization with Gradient and Hessian

This example shows how to solve a nonlinear minimization problem with an explicit tridiagonal Hessian matrix H(x). The problem is to find x to minimize

$$f(x)=\sum_{i=1}^{n-1}\left(\left(x_i^2\right)^{x_{i+1}^2+1}+\left(x_{i+1}^2\right)^{x_i^2+1}\right),$$

where n = 1000.
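
For reference, the sum maps directly onto vectorized MATLAB code. The following one-line version is only an illustration of the formula (the name fobj is hypothetical); the example itself uses the brownfgh helper function at the end, which also returns the derivatives.

% Hypothetical vectorized form of f(x), for illustration only
fobj = @(x) sum((x(1:end-1).^2).^(x(2:end).^2+1) + ...
    (x(2:end).^2).^(x(1:end-1).^2+1));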

The helper function brownfgh at the end of this example calculates f(x), its gradient g(x), and its Hessian H(x). To specify that the fminunc solver use the derivative information, set the SpecifyObjectiveGradient and HessianFcn options using optimoptions. To use a Hessian with fminunc, you must use the 'trust-region' algorithm.

options = optimoptions(@fminunc,'Algorithm','trust-region',...
    'SpecifyObjectiveGradient',true,'HessianFcn','objective');

Set the parameter n to 1000, and set the initial point xstart to –1 for odd components and +1 for even components.

n = 1000;
xstart = -ones(n,1);
xstart(2:2:n) = 1;

Find the minimum value of f.

[x,fval,exitflag,output] = fminunc(@brownfgh,xstart,options);
Local minimum found.

Optimization completed because the size of the gradient is less than
the value of the optimality tolerance.

Examine the solution and solution process.

disp(fval)
   2.8709e-17
disp(exitflag)
     1
disp(output)
         iterations: 7
          funcCount: 8
           stepsize: 0.0039
       cgiterations: 7
      firstorderopt: 4.7948e-10
          algorithm: 'trust-region'
            message: 'Local minimum found....'
    constrviolation: []

The function f(x) is a sum of powers of squares and is therefore nonnegative. The solution fval is nearly zero, so it is clearly a minimum. The exit flag of 1 also indicates that fminunc finds a solution. The output structure shows that fminunc takes only seven iterations to reach the solution.
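
As an optional sanity check (not part of the original example), you can compare the analytic gradient from brownfgh against central finite differences at a small test point. This sketch assumes brownfgh is on the MATLAB path; the names xc, gc, and gfd are illustrative.

rng default                        % reproducible test point
xc = 0.5 + rand(8,1);              % keep components away from zero
[~,gc] = brownfgh(xc);             % analytic gradient
h = 1e-6;
gfd = zeros(8,1);
for k = 1:8
    e = zeros(8,1); e(k) = h;
    gfd(k) = (brownfgh(xc+e) - brownfgh(xc-e))/(2*h);  % central difference
end
disp(max(abs(gc - gfd)))           % expect a small value, on the order of 1e-8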

Display the largest and smallest elements of the solution.

disp(max(x))
   1.1987e-10
disp(min(x))
  -1.1987e-10

The solution is very near the point where all elements of x are zero.
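
Because the Hessian of this problem is tridiagonal, a related variant (a sketch for comparison, not part of the original example) is to omit the analytic Hessian and instead supply its sparsity pattern through the HessPattern option, so that the trust-region algorithm approximates H by sparse finite differences of the gradient. This sketch reuses n and xstart from above; opts2, x2, and fval2 are illustrative names.

pattern = spdiags(ones(n,3),-1:1,n,n);   % tridiagonal sparsity pattern
opts2 = optimoptions(@fminunc,'Algorithm','trust-region', ...
    'SpecifyObjectiveGradient',true,'HessPattern',pattern);
[x2,fval2] = fminunc(@brownfgh,xstart,opts2);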

Helper Function

This code creates the brownfgh helper function.

function [f,g,H] = brownfgh(x)
%BROWNFGH  Nonlinear minimization problem (function, its gradients
% and Hessian)
% Documentation example        

%   Copyright 1990-2008 The MathWorks, Inc.

% Evaluate the function.
  n=length(x); y=zeros(n,1);
  i=1:(n-1);
  y(i)=(x(i).^2).^(x(i+1).^2+1)+(x(i+1).^2).^(x(i).^2+1);
  f=sum(y);
%
% Evaluate the gradient.
  if nargout > 1
     i=1:(n-1); g = zeros(n,1);
     g(i)= 2*(x(i+1).^2+1).*x(i).*((x(i).^2).^(x(i+1).^2))+...
             2*x(i).*((x(i+1).^2).^(x(i).^2+1)).*log(x(i+1).^2);
     g(i+1)=g(i+1)+...
              2*x(i+1).*((x(i).^2).^(x(i+1).^2+1)).*log(x(i).^2)+...
              2*(x(i).^2+1).*x(i+1).*((x(i+1).^2).^(x(i).^2));
  end
%
% Evaluate the (sparse, symmetric) Hessian matrix
  if nargout > 2
     v=zeros(n,1);
     i=1:(n-1);
     v(i)=2*(x(i+1).^2+1).*((x(i).^2).^(x(i+1).^2))+...
            4*(x(i+1).^2+1).*(x(i+1).^2).*(x(i).^2).*((x(i).^2).^((x(i+1).^2)-1))+...
            2*((x(i+1).^2).^(x(i).^2+1)).*(log(x(i+1).^2));
     v(i)=v(i)+4*(x(i).^2).*((x(i+1).^2).^(x(i).^2+1)).*((log(x(i+1).^2)).^2);
     v(i+1)=v(i+1)+...
              2*(x(i).^2).^(x(i+1).^2+1).*(log(x(i).^2))+...
              4*(x(i+1).^2).*((x(i).^2).^(x(i+1).^2+1)).*((log(x(i).^2)).^2)+...
              2*(x(i).^2+1).*((x(i+1).^2).^(x(i).^2));
     v(i+1)=v(i+1)+4*(x(i).^2+1).*(x(i+1).^2).*(x(i).^2).*((x(i+1).^2).^(x(i).^2-1));
     v0=v;              % v0 holds the n diagonal entries of H
     v=zeros(n-1,1);
     v(i)=4*x(i+1).*x(i).*((x(i).^2).^(x(i+1).^2))+...
            4*x(i+1).*(x(i+1).^2+1).*x(i).*((x(i).^2).^(x(i+1).^2)).*log(x(i).^2);
     v(i)=v(i)+ 4*x(i+1).*x(i).*((x(i+1).^2).^(x(i).^2)).*log(x(i+1).^2);
     v(i)=v(i)+4*x(i).*((x(i+1).^2).^(x(i).^2)).*x(i+1);
     v1=v;              % v1 holds the n-1 superdiagonal entries of H
     % Assemble the tridiagonal Hessian from its diagonal (v0) and
     % superdiagonal (v1); averaging with the transpose halves the doubled
     % off-diagonal entries and fills in the subdiagonal.
     i=[(1:n)';(1:(n-1))'];
     j=[(1:n)';(2:n)'];
     s=[v0;2*v1];
     H=sparse(i,j,s,n,n);
     H=(H+H')/2;
  end
end
