On the asymptotic bias of the diffusion-based distributed Pareto optimization

Reza Arablouei*, Kutluyıl Doğançay, Stefan Werner, Yih Fang Huang

*Corresponding author for this work

    Research output: Contribution to journal › Article › Scientific › peer-review


    We revisit the asymptotic bias analysis of the distributed Pareto optimization algorithm developed based on diffusion strategies. We propose an alternative way to analyze the asymptotic bias of this algorithm at small step-sizes and show that the asymptotic bias decays to zero with a linear dependence on the largest step-size parameter when this parameter is sufficiently small. In addition, through the proposed analytic approach, we provide an expression for the small-step-size asymptotic bias when a condition assumed jointly on the combination matrices and the step-sizes does not strictly hold. This is a likely scenario in practice, which has not been considered in the original paper that introduced the algorithm. Our methodology provides new insights into the inner workings of the diffusion Pareto optimization algorithm while being considerably less involved than the small-step-size asymptotic bias analysis presented in the original work. This is because we take advantage of the special eigenstructure of the composite combination matrix used in the algorithm without calling for any eigenspace decomposition or matrix inversion.
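    The linear dependence of the asymptotic bias on the largest step-size can be observed numerically in a minimal sketch of a diffusion adapt-then-combine (ATC) iteration. The toy network below (three agents with scalar quadratic costs, a left-stochastic combination matrix, and deterministic gradients) is an illustrative assumption, not the setup from the paper: each agent's steady-state iterate deviates from the small-step-size Pareto limit point by an amount that shrinks roughly linearly with the step-size.

```python
import numpy as np

# Toy illustration (not from the paper): a 3-agent ATC diffusion network
# with scalar quadratic costs J_k(w) = 0.5 * c[k] * (w - t[k])**2.
# All numerical values are illustrative assumptions.
c = np.array([1.0, 2.0, 0.5])   # curvatures of the individual costs
t = np.array([0.0, 1.0, -1.0])  # individual minimizers (agents "disagree")

# Left-stochastic combination matrix (each column sums to one).
A = np.array([[0.5, 0.2, 0.1],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Perron eigenvector p of A (A p = p), normalized to sum to one.
eigvals, eigvecs = np.linalg.eig(A)
p = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
p /= p.sum()

# Small-step-size Pareto limit point w*: sum_k p_k c_k (w* - t_k) = 0.
w_star = np.sum(p * c * t) / np.sum(p * c)

def steady_state_bias(mu, iters=20000):
    """Run deterministic ATC diffusion and return the largest distance
    of any agent's steady-state iterate from w_star."""
    w = np.zeros(3)
    for _ in range(iters):
        psi = w - mu * c * (w - t)  # adapt: local gradient step, step-size mu
        w = A.T @ psi               # combine: w_k = sum_l A[l, k] * psi_l
    return np.max(np.abs(w - w_star))

b1 = steady_state_bias(0.01)
b2 = steady_state_bias(0.005)
# Halving the step-size roughly halves the bias (linear dependence).
print(b1, b2, b1 / b2)
```

Halving the step-size from 0.01 to 0.005 roughly halves the measured bias, consistent with the O(mu_max) behavior discussed in the abstract.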

    Original language: English
    Pages (from-to): 337-342
    Number of pages: 6
    Journal: Signal Processing
    Publication status: Published - 31 May 2017
    MoE publication type: A1 Journal article-refereed


    • Asymptotic bias
    • Diffusion strategies
    • Distributed optimization
    • Pareto optimization
    • Performance analysis

