A Generalized Accelerated Composite Gradient Method: Uniting Nesterov's Fast Gradient Method and FISTA

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Numerous problems in signal processing, statistical inference, computer vision, and machine learning can be cast as large-scale convex optimization problems. Due to their size, many of these problems can only be addressed by first-order accelerated black-box methods. The most popular among these are the Fast Gradient Method (FGM) and the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). FGM requires that the objective be finite and differentiable with a known gradient Lipschitz constant. FISTA is applicable to the broader class of composite objectives and is equipped with a line-search procedure for estimating the Lipschitz constant. Nonetheless, FISTA cannot increase the step size and is unable to take advantage of strong convexity. FGM and FISTA are very similar in form but appear to have vastly differing convergence analyses. In this work we unite the two analyses. We generalize the previously introduced augmented estimate sequence framework as well as the related notion of the gap sequence. We use our tools to construct a Generalized Accelerated Composite Gradient Method that encompasses FGM and FISTA, along with their most popular variants. The Lyapunov property of the generalized gap sequence used in deriving our method implies that both FGM and FISTA are amenable to a Lyapunov analysis, common among optimization algorithms. We further showcase the flexibility of our tools by endowing our method with monotonically decreasing objective function values alongside a versatile line-search procedure. By simultaneously incorporating the strengths of FGM and FISTA, our method is able to surpass both in terms of robustness and usability. We support our findings with simulation results on an extensive benchmark of composite problems.
Our experiments show that monotonicity has a stabilizing effect on convergence and challenge the notion present in the literature that, for strongly convex objectives, accelerated proximal schemes can be reduced to fixed momentum methods.
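For orientation, the FISTA baseline that the abstract refers to can be sketched in a few lines. This is a minimal, textbook-style sketch with a fixed step size, not the generalized method proposed in the article; the names `grad_f`, `prox_g`, and `L` are illustrative assumptions, and the line-search and monotonicity features discussed above are omitted.

```python
import numpy as np

def fista(grad_f, prox_g, x0, L, n_iter=100):
    """Minimal FISTA sketch for a composite objective F(x) = f(x) + g(x).

    Assumptions (illustrative, not from the article): grad_f is the gradient
    of the smooth part f, prox_g(v, t) is the proximal operator of g with
    step t, and L is a known Lipschitz constant of grad_f (no line-search).
    """
    x = x0.copy()
    y = x0.copy()   # extrapolated point
    t = 1.0         # momentum parameter
    for _ in range(n_iter):
        # proximal gradient step taken at the extrapolated point y
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)
        # standard FISTA momentum update
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

With `g = 0` the prox is the identity and the scheme reduces to Nesterov-style accelerated gradient descent on the smooth part alone, which illustrates the structural similarity between FISTA and FGM that the article exploits.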

Original language: English
Article number: 9072504
Pages (from-to): 3033-3048
Number of pages: 16
Journal: IEEE Transactions on Signal Processing
Volume: 68
Early online date: 2020
Publication status: Published - 2020
MoE publication type: A1 Journal article-refereed

Keywords

  • Convergence
  • Acceleration
  • Gradient methods
  • Signal processing algorithms
  • Linear programming
  • Radio frequency
  • estimate sequence
  • Nesterov method
  • fast gradient method
  • FISTA
  • monotone
  • line-search
  • composite objective
  • large-scale optimization
