Optimized first-order methods for smooth convex minimization

Math Program. 2016 Sep;159(1):81-107. doi: 10.1007/s10107-015-0949-3. Epub 2015 Oct 17.

Abstract

We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle [5] recently described a numerical method for computing the N-iteration optimal step coefficients in a class of first-order algorithms that includes gradient methods, heavy-ball methods [15], and Nesterov's fast gradient methods [10,12]. However, the numerical method in [5] is computationally expensive for large N, and the corresponding numerically optimized first-order algorithm in [5] requires memory and computation that are impractical for large-scale optimization problems. In this paper, we propose optimized first-order algorithms whose worst-case convergence bound is twice as small as that of Nesterov's fast gradient methods; this bound is derived analytically and refines the numerical bound of [5]. Furthermore, the proposed optimized first-order methods have efficient forms that are remarkably similar to Nesterov's fast gradient methods.

Keywords: Convergence bound; Fast gradient methods; First-order algorithms; Smooth convex minimization.
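
As context for the comparison in the abstract, here is a minimal sketch of Nesterov's fast gradient method, the baseline whose convergence bound the proposed methods halve. The step and momentum rules below are the standard published forms of the fast gradient method, not the optimized methods of this paper, and the toy quadratic problem is an illustrative assumption.

```python
import numpy as np

def fgm(grad, L, x0, n_iter):
    """Nesterov's fast gradient method for an L-smooth convex f.

    Standard form: gradient step at the extrapolated point, then a
    momentum step with the t-sequence t_{k+1} = (1 + sqrt(1 + 4 t_k^2))/2.
    Guarantees f(y_N) - f* <= 2 L ||x0 - x*||^2 / (N + 1)^2.
    """
    x = np.asarray(x0, dtype=float)  # extrapolated point
    y = x.copy()                     # previous gradient-step point
    t = 1.0
    for _ in range(n_iter):
        y_next = x - grad(x) / L                         # gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        x = y_next + ((t - 1.0) / t_next) * (y_next - y)  # momentum step
        y, t = y_next, t_next
    return y

# Hypothetical example: f(x) = 0.5 x^T A x with A = diag(1, 10), so L = 10
# and the minimizer is the origin with f* = 0.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
y_N = fgm(grad, L=10.0, x0=[5.0, 5.0], n_iter=100)
f_val = 0.5 * y_N @ A @ y_N
```

On this instance the final objective value stays well below the theoretical worst-case bound 2L‖x0 − x*‖²/(N + 1)²; the paper's optimized methods keep the same per-iteration cost while tightening this bound by a factor of two.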