A Generalized Accelerated Composite Gradient Method: Uniting Nesterov's Fast Gradient Method and FISTA

Research output: Contribution to journal › Article › Scientific › peer-review


Numerous problems in signal processing, statistical inference, computer vision, and machine learning can be cast as large-scale convex optimization problems. Due to their size, many of these problems can only be addressed by first-order accelerated black-box methods. The most popular among these are the Fast Gradient Method (FGM) and the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). FGM requires that the objective be finite and differentiable with a known gradient Lipschitz constant. FISTA is applicable to the broader class of composite objectives and is equipped with a line-search procedure for estimating the Lipschitz constant. Nonetheless, FISTA cannot increase the step size and is unable to take advantage of strong convexity. FGM and FISTA are very similar in form; despite this, they appear to have vastly differing convergence analyses. In this work we generalize the previously introduced augmented estimate sequence framework as well as the related notion of the gap sequence. We showcase the flexibility of our tools by constructing a Generalized Accelerated Composite Gradient Method that unites FGM and FISTA, along with their most popular variants. The Lyapunov property of the generalized gap sequence used in deriving our method implies that both FGM and FISTA are amenable to a Lyapunov analysis, common among optimization algorithms. We further showcase the flexibility of our tools by endowing our method with monotonically decreasing objective function values alongside a versatile line-search procedure. By simultaneously incorporating the strengths of FGM and FISTA, our method surpasses both in robustness and usability. We support our findings with simulation results on an extensive benchmark of composite problems. Our experiments show that monotonicity has a stabilizing effect on convergence, and they challenge the notion in the literature that, for strongly convex objectives, accelerated proximal schemes can be reduced to fixed-momentum methods.
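To illustrate the composite setting the abstract describes, here is a minimal sketch of the classical fixed-step FISTA iteration applied to a lasso instance. This is our own illustrative example of the baseline algorithm, not the paper's generalized method; the function names, the lasso problem, and the fixed-step variant (no line search) are all assumptions made for brevity.

```python
import numpy as np

def fista(grad_f, prox_g, L, x0, iters=100):
    """Classical fixed-step FISTA for F(x) = f(x) + g(x):
    f smooth with L-Lipschitz gradient, g accessed through its proximal
    operator. Illustrative sketch, not the paper's generalized method."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        # Proximal gradient step at the extrapolated point y
        x_next = prox_g(y - grad_f(y) / L, 1.0 / L)
        # Standard FISTA momentum parameter update
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation (momentum) step
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Example composite objective: lasso,
# f(x) = 0.5 * ||A x - b||^2 (smooth), g(x) = lam * ||x||_1 (nonsmooth)
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f (sigma_max^2)
grad_f = lambda x: A.T @ (A @ x - b)
# Proximal operator of step * lam * ||.||_1: soft-thresholding
prox_g = lambda v, step: np.sign(v) * np.maximum(np.abs(v) - lam * step, 0.0)
x_star = fista(grad_f, prox_g, L, np.zeros(20), iters=500)
```

Note that, as the abstract points out, this baseline iteration is not monotone in the objective and cannot exploit strong convexity; those are precisely the limitations the proposed generalized method addresses.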
Original language: English
Journal: IEEE Transactions on Signal Processing
Publication status: E-pub ahead of print - 2020
MoE publication type: A1 Journal article-refereed


  • Convergence
  • Acceleration
  • Gradient methods
  • Signal processing algorithms
  • Linear programming
  • Radio frequency
  • estimate sequence
  • Nesterov method
  • fast gradient method
  • monotone
  • line-search
  • composite objective
  • large-scale optimization


  • Projects

    Transmit beamspace for active compressive sensing and communication with multiple waveforms

    Kocharlakota, K., Upadhya, K., Li, Y., Rizwan Ullah, R., Gao, R., Vorobyov, S. & Ghorbani Veshki, F.


    Project: Academy of Finland: Other research funding
