
The Quasi--Newton Method.

This method is identical to the modified Gauss--Newton method, except for the way in which the Hessian matrix is approximated.

This matrix is first initialized to zero. At each iteration a new estimate of the Hessian is obtained by adding a rank-one or rank-two correction matrix to the previous estimate, such that $G^{(k)}$, the estimate of the Hessian matrix at the $k$-th iteration, satisfies the quasi-Newton condition
\[
G^{(k)} \, \delta^{(k-1)} = \gamma^{(k-1)} ,
\]
with the step $\delta$ and the gradient change $\gamma$ as defined below.

The so-called BFGS updating formula is applied in this algorithm,
\[
G^{(k+1)} = G^{(k)}
  - \frac{G^{(k)} \delta^{(k)} \delta^{(k)T} G^{(k)}}{\delta^{(k)T} G^{(k)} \delta^{(k)}}
  + \frac{\gamma^{(k)} \gamma^{(k)T}}{\gamma^{(k)T} \delta^{(k)}} ,
\]
where
\[
\delta^{(k)} = x^{(k+1)} - x^{(k)}
\]
and
\[
\gamma^{(k)} = g^{(k+1)} - g^{(k)} ,
\]
$g$ denoting the gradient of the objective function; please see Gill, Murray and Pitfield (1972) for more details. After some iterations, and close to the optimum, $G^{(k)}$ converges to the Hessian.
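
As an illustration of how the update works in practice, the following is a minimal sketch in Python/NumPy; it is not the MIDAS implementation, and the helper name bfgs_update, the quadratic test function and the tolerance are illustrative assumptions only.

\begin{verbatim}
# Minimal sketch of one BFGS rank-two correction of the Hessian estimate G.
# delta = x_new - x_old (step), gamma = g_new - g_old (gradient change).
import numpy as np

def bfgs_update(G, delta, gamma):
    Gd  = G @ delta
    dGd = delta @ Gd          # delta^T G delta
    gd  = gamma @ delta       # gamma^T delta
    G_new = G.copy()
    if abs(dGd) > 1e-12:      # guard: this term vanishes for the zero start matrix
        G_new -= np.outer(Gd, Gd) / dGd
    if abs(gd) > 1e-12:
        G_new += np.outer(gamma, gamma) / gd
    return G_new

# Illustrative check on a quadratic f(x) = 0.5 x^T A x, whose gradient is A x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x

x_old = np.array([1.0, -1.0])
x_new = np.array([0.4, -0.3])                    # an arbitrary next iterate
delta, gamma = x_new - x_old, grad(x_new) - grad(x_old)

G = bfgs_update(np.zeros((2, 2)), delta, gamma)  # start from the zero matrix
print(np.allclose(G @ delta, gamma))             # True: quasi-Newton condition holds
\end{verbatim}

Starting from the zero matrix, the first correction reduces to the single rank-one term $\gamma \gamma^{T} / (\gamma^{T} \delta)$, which is why the denominator involving $G$ is guarded against being zero; after each update the new estimate satisfies the quasi-Newton condition given above.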

This method requires knowledge of the derivatives. Since the gradients are computed only once per iteration, the Hessian is approximated more roughly than with the modified Gauss--Newton method; the quasi-Newton method is therefore better suited to problems with a large number of parameters.

