quasinew

Purpose

Quasi-Newton optimization.

Description

[x, options, flog, pointlog] = quasinew(f, x, options, gradf) uses a quasi-Newton algorithm to find a local minimum of the function f(x) whose gradient is given by gradf(x). Here x is a row vector and f returns a scalar value. The point at which f has a local minimum is returned as x. The function value at that point is returned in options(8). A log of the function values after each cycle is (optionally) returned in flog, and a log of the points visited is (optionally) returned in pointlog.
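
As a minimal sketch of the basic calling convention (assuming the rosen and rosegrad Rosenbrock helper functions supplied with the Netlab demonstrations are on the path; the option values here are chosen purely for illustration):

x = [-1 1];                    % starting point (row vector)
options = zeros(1, 18);        % zero-initialised options vector
options(1) = 1;                % display function values at each cycle
options(14) = 100;             % at most 100 iterations
[x, options, flog, pointlog] = quasinew('rosen', x, options, 'rosegrad');
fmin = options(8);             % function value at the returned minimum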

quasinew(f, x, options, gradf, p1, p2, ...) allows additional arguments to be passed to f() and gradf().

The optional parameters have the following interpretations; a short configuration sketch follows the list.

options(1) is set to 1 to display error values; also logs error values in the return argument flog, and the points visited in the return argument pointlog. If options(1) is set to 0, then only warning messages are displayed. If options(1) is -1, then nothing is displayed.

options(2) is a measure of the absolute precision required for the value of x at the solution. If the absolute difference between the values of x at two successive steps is less than options(2), then this condition is satisfied.

options(3) is a measure of the precision required of the objective function at the solution. If the absolute difference between the objective function values at two successive steps is less than options(3), then this condition is satisfied. Both this and the previous condition must be satisfied for termination.

options(9) should be set to 1 to check the user-defined gradient function.

options(10) returns the total number of function evaluations (including those in any line searches).

options(11) returns the total number of gradient evaluations.

options(14) is the maximum number of iterations; default 100.

options(15) is the precision in parameter space of the line search; default 1e-2.
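
As an illustration of these settings (the particular values are chosen only for the sake of example), a typical options vector might be set up as follows:

options = zeros(1, 18);    % zero-initialised options vector
options(1) = 1;            % display error values at each cycle
options(2) = 1e-4;         % precision required of x at the solution
options(3) = 1e-4;         % precision required of the objective function
options(9) = 1;            % check the user-defined gradient function
options(14) = 100;         % maximum number of iterations
options(15) = 1e-2;        % line-search precision in parameter space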

Examples

An example of the use of the additional arguments is the minimization of an error function for a neural network:

w = quasinew('neterr', w, options, 'netgrad', net, x, t);
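
For context, the weight vector w and the network net in this call might be set up as in the following sketch, which assumes an MLP created with the Netlab mlp function and an existing data set x with targets t (nin, nhidden and nout are placeholders for the network dimensions):

net = mlp(nin, nhidden, nout, 'linear');   % create a two-layer MLP
w = netpak(net);                           % pack its weights into a row vector
options = zeros(1, 18);
options(1) = 1;                            % display error values
options(14) = 100;                         % at most 100 iterations
w = quasinew('neterr', w, options, 'netgrad', net, x, t);
net = netunpak(net, w);                    % copy the optimised weights back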

Algorithm

The quasi-Newton algorithm builds up an approximation to the inverse Hessian over a number of steps. The method requires order W squared storage, where W is the number of function parameters. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) formula is used for the inverse Hessian updates. The line searches are carried out to a relatively low precision (1.0e-2).
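
For reference, a single BFGS update of the inverse Hessian approximation can be written as in the sketch below (a textbook statement of the formula, not the Netlab implementation; x and xnew are the previous and current points, g and gnew the corresponding gradients, and hessinv the current inverse Hessian estimate, with points and gradients stored as row vectors):

s = xnew - x;                        % step in parameter space
y = gnew - g;                        % change in gradient
rho = 1 / (y * s');                  % reciprocal of the curvature term
I = eye(length(x));
hessinv = (I - rho * (s' * y)) * hessinv * (I - rho * (y' * s)) + rho * (s' * s);
direction = -(hessinv * gnew')';     % next search direction for the line search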

See Also

conjgrad, graddesc, linemin, minbrack, scg

Copyright (c) Ian T Nabney (1996-9)