Globalization of Barzilai and Borwein Method for Unconstrained Optimization
Main Author:
Format: Thesis
Language: English
Published: 2009
Online Access: http://psasir.upm.edu.my/id/eprint/10387/1/IPM_2009_10_A.pdf
http://psasir.upm.edu.my/id/eprint/10387/
Summary: The focus of this thesis is on finding the unconstrained minimizer of a function. Specifically, we focus on the Barzilai and Borwein (BB) method, a well-known two-point stepsize gradient method. We first give a brief mathematical background and then discuss the BB method, which is important in the area of optimization. A review of the minimization methods currently available for solving unconstrained optimization problems is also given.
Owing to its simplicity, low storage requirement and numerical efficiency, the Barzilai and Borwein method has received a good deal of attention in the optimization community. Despite these advances, the BB stepsize is computed from a simple approximation of the Hessian by a scalar multiple of the identity; moreover, the BB method is not monotone, and it is not easy to generalize it to general nonlinear functions. To address these deficiencies, we introduce new gradient-type methods in the framework of the BB method, including a new gradient method via the weak secant equation (quasi-Cauchy relation), an improved Hessian approximation, and scaling of the diagonal updating.
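For context, the classical two-point stepsizes of Barzilai and Borwein are obtained by forcing a scalar multiple of the identity, $B_k = \alpha_k^{-1} I$, to satisfy the secant condition in a least-squares sense; the standard formulas (quoted here as background, not from the thesis itself) are
\[
\alpha_k^{BB1} = \frac{s_{k-1}^{T} s_{k-1}}{s_{k-1}^{T} y_{k-1}},
\qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{T} y_{k-1}}{y_{k-1}^{T} y_{k-1}},
\]
where $s_{k-1} = x_k - x_{k-1}$, $y_{k-1} = g_k - g_{k-1}$, and the iteration takes the form $x_{k+1} = x_k - \alpha_k g_k$ with $g_k$ the gradient at $x_k$.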
The proposed methods are fixed-stepsize gradient methods, like the Barzilai and Borwein method. In contrast to the Barzilai and Borwein approach, in which the stepsize is computed from a simple approximation of the Hessian by a scalar multiple of the identity, the proposed methods approximate the Hessian by a diagonal matrix. Incorporated with monotone strategies, the resulting algorithms belong to the class of monotone gradient methods and are globally convergent. Numerical results suggest that for non-quadratic minimization problems the new methods clearly outperform the Barzilai and Borwein method.
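For illustration, a diagonal Hessian approximation $D_{k+1}$ of this type is typically asked to satisfy the weak secant (quasi-Cauchy) relation rather than the full secant equation. One well-known least-change update of this kind, in the spirit of Zhu, Nazareth and Wolkowicz and shown here as background rather than as the thesis's exact formula, is
\[
s_k^{T} D_{k+1} s_k = s_k^{T} y_k,
\qquad
D_{k+1} = D_k + \frac{s_k^{T} y_k - s_k^{T} D_k s_k}{\sum_{i=1}^{n} s_{k,i}^{4}}\,
\operatorname{diag}\!\left(s_{k,1}^{2},\dots,s_{k,n}^{2}\right),
\]
which minimizes $\|D_{k+1} - D_k\|_F$ over diagonal matrices subject to the weak secant relation.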
Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis.