On the acceleration of the Barzilai–Borwein method
Document Type
Article
Publication Date
4-1-2022
Abstract
The Barzilai–Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy due to its ingenious stepsize, which generally yields nonmonotone behavior. In this paper, we propose a new stepsize to accelerate the BB method by requiring finite termination for minimizing the two-dimensional strongly convex quadratic function. Based on this new stepsize, we develop an efficient gradient method for quadratic optimization that adaptively takes the nonmonotone BB stepsizes and certain monotone stepsizes. Two variants using retard stepsizes associated with the new stepsize are also presented. Numerical experiments show that our strategies of properly inserting monotone gradient steps into the nonmonotone BB method can significantly improve its performance, and that our new methods are competitive with the most successful gradient descent methods developed in the recent literature.
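For context, the classical BB method mentioned in the abstract can be sketched as follows. This is a minimal illustration of the standard BB1 stepsize on a strongly convex quadratic, not an implementation of the paper's new accelerated stepsize; the initial stepsize choice and stopping tolerance are assumptions for the sketch.

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=200, tol=1e-10):
    """Classical Barzilai-Borwein (BB1) gradient method for
    f(x) = 0.5 * x^T A x - b^T x, with gradient g(x) = A x - b.
    Illustrative sketch only; not the paper's proposed method."""
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(A, 2)  # assumed safe initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)  # BB1 stepsize: s^T s / s^T y
        x, g = x_new, g_new
    return x

# Example: two-dimensional strongly convex quadratic, as in the
# finite-termination setting the abstract refers to.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient(A, b, np.zeros(2))
```

The BB stepsize reuses gradient and iterate differences from the previous step, which is why the iteration is cheap per step yet typically nonmonotone in the objective value.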
Publication Source (Journal or Book title)
Computational Optimization and Applications
First Page
717
Last Page
740
Recommended Citation
Huang, Y., Dai, Y., Liu, X., & Zhang, H. (2022). On the acceleration of the Barzilai–Borwein method. Computational Optimization and Applications, 81(3), 717-740. https://doi.org/10.1007/s10589-022-00349-z