A Fast Finite-Time Consensus based Gradient Method for Distributed Optimization over Digraphs

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


In this paper, we study the unconstrained distributed optimization problem over strongly connected directed communication graphs. We propose an algorithm that combines gradient descent (GD) with finite-time exact ratio consensus (FTERC). Unlike average or dynamic average consensus techniques with only asymptotic convergence, or finite-time "approximate" consensus techniques with inexact accuracy in the literature, our proposed distributed FTERC-based GD algorithm uses FTERC for gradient tracking and, as demonstrated in simulations, achieves a faster convergence rate in the optimization iteration number and admits a larger step-size upper bound than other algorithms.
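The abstract's core idea is to interleave gradient descent with ratio consensus over a digraph, where column-stochastic mixing makes the ratio of two iterates converge to an exact average even when the graph is not balanced. The authors' FTERC reaches that average exactly in finitely many steps; as a rough structural illustration only, the sketch below substitutes ordinary asymptotic push-sum ratio consensus for FTERC. The 4-node digraph, mixing weights, and quadratic local costs are all hypothetical choices, not taken from the paper.

```python
import numpy as np

# Hypothetical strongly connected 4-node digraph with self-loops.
# A is column-stochastic (each column sums to 1): node j splits its
# value equally among itself and its out-neighbors. Rows do NOT sum
# to 1, so plain averaging would fail; the ratio trick fixes this.
n = 4
A = np.zeros((n, n))
A[[0, 1, 2], 0] = 1.0 / 3.0          # node 0 sends to itself, 1, and 2
for j in range(1, n):
    A[j, j] = 0.5                    # keep half locally
    A[(j + 1) % n, j] = 0.5          # send half to out-neighbor

def ratio_consensus(x0, rounds=60):
    """Push-sum ratio consensus: z_i = x_i / y_i -> average of x0."""
    x = x0.copy()
    y = np.ones(n)                   # auxiliary weights, all start at 1
    for _ in range(rounds):
        x = A @ x                    # column-stochastic mixing preserves sums
        y = A @ y
    return x / y                     # ratio converges to mean(x0) at every node

# Local quadratic costs f_i(z) = 0.5 * (z - a_i)^2; the global
# minimizer of the sum is mean(a).
a = np.array([1.0, 3.0, 5.0, 7.0])
z = np.zeros(n)                      # each node's estimate of the minimizer
step = 0.5
for _ in range(50):
    grads = z - a                            # local gradients
    avg_grad = ratio_consensus(grads)        # stand-in for exact FTERC averaging
    z = ratio_consensus(z) - step * avg_grad # consensus step + gradient step

print(np.round(z, 3))  # every node's estimate is near mean(a) = 4.0
```

Replacing the inner `ratio_consensus` loop with a finite-time exact scheme is precisely what lets the paper's method behave like centralized GD on the average gradient, which is the source of its faster rate and larger admissible step size.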
Original language: English
Title of host publication: 2022 IEEE 61st Conference on Decision and Control (CDC)
Number of pages: 7
ISBN (Electronic): 978-1-6654-6761-2
Publication status: Published - 2022
MoE publication type: A4 Article in a conference publication
Event: IEEE Conference on Decision and Control - Cancun, Mexico
Duration: 6 Dec 2022 - 9 Dec 2022
Conference number: 61

Publication series

Name: Proceedings of the IEEE Conference on Decision & Control
ISSN (Electronic): 2576-2370


Conference: IEEE Conference on Decision and Control
Abbreviated title: CDC


  • Gradient methods
  • Upper bound
  • Costs
  • Additives
  • Heuristic algorithms
  • Directed graphs
  • Approximation algorithms
  • Distributed optimization
  • gradient tracking
  • finite-time consensus
  • directed graphs
  • gradient descent


