Abstract
We revisit the asymptotic bias analysis of the distributed Pareto optimization algorithm built on diffusion strategies. We propose an alternative way to analyze the asymptotic bias of this algorithm at small step-sizes and show that the bias decays to zero linearly in the largest step-size parameter once this parameter is sufficiently small. In addition, the proposed analytic approach yields an expression for the small-step-size asymptotic bias when a joint condition on the combination matrices and the step-sizes does not strictly hold. This is a likely scenario in practice, which was not considered in the original paper that introduced the algorithm. Our methodology provides new insights into the inner workings of the diffusion Pareto optimization algorithm while being considerably less involved than the small-step-size asymptotic bias analysis of the original work, because it exploits the special eigenstructure of the composite combination matrix used by the algorithm without requiring any eigenspace decomposition or matrix inversion.
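The linear dependence of the asymptotic bias on the step-size can be illustrated numerically. The sketch below runs a standard adapt-then-combine (ATC) diffusion strategy on a hypothetical two-agent problem with quadratic local costs; the local minimizers `d`, the combination matrix `A`, and the step-sizes are illustrative assumptions, not values from the paper. Shrinking the common step-size `mu` by a factor of ten shrinks the worst-agent deviation from the small-step-size Pareto point `w*` by roughly the same factor.

```python
import numpy as np

# Toy two-agent Pareto setting (illustrative, not from the paper): agent k
# minimizes its own quadratic J_k(w) = 0.5 * (w - d_k)^2 with a different
# minimizer d_k, so no single w is optimal for both agents at once.

d = np.array([0.0, 1.0])                  # local minimizers of J_1, J_2
A = np.array([[0.8, 0.4],                 # left-stochastic combination matrix:
              [0.2, 0.6]])                # column k holds agent k's weights
p = np.array([2 / 3, 1 / 3])              # Perron eigenvector of A (A @ p = p)
w_star = p @ d                            # limiting Pareto point as mu -> 0

def diffusion_atc(mu, iters=100_000):
    """Adapt-then-combine diffusion with a common step-size mu."""
    w = np.zeros(2)                       # agents' estimates
    for _ in range(iters):
        psi = w - mu * (w - d)            # adapt: gradient step on local cost
        w = A.T @ psi                     # combine: neighborhood averaging
    return w

for mu in (0.1, 0.01, 0.001):
    bias = np.abs(diffusion_atc(mu) - w_star).max()
    print(f"mu = {mu:5.3f}   max |w_k - w*| = {bias:.2e}")
```

Running the loop shows the printed bias shrinking roughly tenfold with each tenfold reduction of `mu`, consistent with the linear small-step-size dependence discussed in the abstract.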
| Original language | English |
|---|---|
| Pages (from-to) | 337-342 |
| Number of pages | 6 |
| Journal | Signal Processing |
| Volume | 130 |
| DOIs | |
| Publication status | Published - 31 May 2017 |
| MoE publication type | A1 Journal article-refereed |
Keywords
- Asymptotic bias
- Diffusion strategies
- Distributed optimization
- Pareto optimization
- Performance analysis
Title
On the asymptotic bias of the diffusion-based distributed Pareto optimization