On the convergence result of the gradient-push algorithm on directed graphs with constant stepsize

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Distributed optimization has attracted considerable interest due to its wide range of applications across various fields. It involves multiple agents, connected by a graph, that collaboratively minimize a total cost. In many applications the communication graph is directed, and the gradient-push algorithm is a fundamental method for distributed optimization in this setting. Despite its wide use in the literature, its convergence has not been well established for the important case where the stepsize is constant and the domain is the entire space. This work proves that the gradient-push algorithm with constant stepsize α>0 converges exponentially fast to an O(α)-neighborhood of the optimizer, provided α is smaller than a specific threshold. For this result, we assume that each local cost is smooth and the total cost is strongly convex. Numerical experiments are provided to support the theoretical convergence result. We also present a numerical test showing that the gradient-push algorithm may approach a small neighborhood of the minimizer faster than Push-DIGing, a variant of the gradient-push algorithm in which agents additionally share their gradient information.
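The abstract's setting can be illustrated with a minimal sketch of the gradient-push (push-sum plus gradient step) iteration on a small directed network. The graph, the quadratic local costs f_i(x) = ½(x − b_i)², the data values, and the stepsize below are all illustrative assumptions, not taken from the paper; the update structure follows the standard subgradient-push scheme with a column-stochastic mixing matrix and a constant stepsize α.

```python
# Illustrative sketch of gradient-push with constant stepsize on a
# 3-agent strongly connected directed graph (all numbers are assumptions).
# Agent i holds f_i(x) = 0.5*(x - b[i])**2, so the total cost
# sum_i f_i is minimized at the mean of b.

b = [1.0, 2.0, 6.0]           # local data; global minimizer is mean(b) = 3
n = len(b)

# Directed, strongly connected graph (with self-loops): 0 -> {1, 2}, 1 -> 2, 2 -> 0.
out_neighbors = {0: [0, 1, 2], 1: [1, 2], 2: [2, 0]}

# Column-stochastic mixing matrix: agent j splits its mass evenly
# over its out-neighbors, so each column of A sums to 1.
A = [[0.0] * n for _ in range(n)]
for j, outs in out_neighbors.items():
    for i in outs:
        A[i][j] = 1.0 / len(outs)

alpha = 0.05                  # constant stepsize (the alpha of the abstract)
x = b[:]                      # agents' states
y = [1.0] * n                 # push-sum weights
z = x[:]                      # de-biased estimates z_i = w_i / y_i

for _ in range(2000):
    # Push-sum mixing: w = A x and y = A y with column-stochastic A.
    w = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    y = [sum(A[i][j] * y[j] for j in range(n)) for i in range(n)]
    # De-bias, then take a local gradient step; grad f_i(z) = z - b_i here.
    z = [w[i] / y[i] for i in range(n)]
    x = [w[i] - alpha * (z[i] - b[i]) for i in range(n)]

print(z)  # each z_i should lie close to the global minimizer 3
```

With a constant stepsize the iterates do not reach the minimizer exactly; consistent with the paper's result, they settle in a neighborhood of it whose size shrinks with α.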

Original language: English
Pages (from-to): 713-736
Number of pages: 24
Journal: Journal of Global Optimization
Volume: 92
Issue number: 3
DOIs
State: Published - Jul 2025

Keywords

  • Convex optimization
  • Gradient-push algorithm
  • Push-sum algorithm
