Abstract
This work focuses on the challenges of non-IID data and stragglers/dropouts in federated learning. We introduce and explore a privacy-flexible paradigm that models parts of the clients’ local data as non-private, offering a more versatile and business-oriented perspective on privacy. Within this framework, we propose a data-driven strategy for mitigating the effects of label heterogeneity and client straggling on federated learning. Our solution combines offline data sharing with approximate gradient coding techniques. Through theoretical results and numerical simulations on the MNIST dataset, we demonstrate that our approach enables a deliberate trade-off between privacy and utility, leading to improved model convergence and accuracy while using an adaptable portion of non-private data.
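To make the two ingredients of the abstract concrete, the following is a minimal, illustrative sketch (not the paper's algorithm) of how an offline-shared pool of non-private data and a simple fallback for straggling clients could interact in a federated training round. It assumes a toy linear least-squares model in NumPy; the names and parameters (`K`, `alpha`, `dropout_prob`) are hypothetical, and the server-side fallback merely stands in for the approximate gradient coding scheme described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K clients, each holding a distribution-shifted local dataset
# (a stand-in for label heterogeneity) for a linear least-squares model.
K, d, n_local = 10, 20, 50
alpha = 0.2          # assumed fraction of each client's data treated as non-private
dropout_prob = 0.3   # probability that a client straggles/drops out in a round

# Synthetic non-IID local datasets: each client sees a shifted feature distribution.
w_true = rng.normal(size=d)
clients = []
for k in range(K):
    X = rng.normal(loc=k / K, size=(n_local, d))
    y = X @ w_true + 0.1 * rng.normal(size=n_local)
    clients.append((X, y))

# Offline data-sharing step: every client uploads an alpha-fraction of its data
# (modeled as non-private) to a shared pool held by the server.
n_shared = int(alpha * n_local)
shared_X = np.vstack([X[:n_shared] for X, _ in clients])
shared_y = np.concatenate([y[:n_shared] for _, y in clients])

def local_gradient(w, X, y):
    """Least-squares gradient of the local objective."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

# Training loop: responding clients compute gradients on (local + shared) data;
# the server averages whatever arrives and, if everyone straggles, falls back on
# the shared pool to approximate the missing contributions.
w = np.zeros(d)
lr = 0.05
for _ in range(100):
    arrived = []
    for X, y in clients:
        if rng.random() > dropout_prob:               # client responds this round
            X_aug = np.vstack([X, shared_X])
            y_aug = np.concatenate([y, shared_y])
            arrived.append(local_gradient(w, X_aug, y_aug))
    if not arrived:                                   # every client straggled
        arrived.append(local_gradient(w, shared_X, shared_y))
    w -= lr * np.mean(arrived, axis=0)

print("distance to ground-truth model:", np.linalg.norm(w - w_true))
```

In this toy setting, `alpha` plays the role of the adaptable non-private portion: a larger shared pool gives the server more to compensate stragglers and skewed clients with, at the cost of exposing more data, which mirrors the privacy-utility trade-off discussed in the abstract.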
Original language | English
---|---
Title of host publication | 32nd European Signal Processing Conference, EUSIPCO 2024 - Proceedings
Publisher | European Signal Processing Conference (EUSIPCO)
Pages | 1167-1171
Number of pages | 5
ISBN (Electronic) | 978-9-4645-9361-7
DOIs |
Publication status | Published - 2024
MoE publication type | A4 Conference publication
Event | European Signal Processing Conference (conference number 32) - Lyon, France, 26 Aug 2024 → 30 Aug 2024
Publication series
Name | European Signal Processing Conference
---|---
ISSN (Print) | 2219-5491
Conference
Conference | European Signal Processing Conference
---|---
Abbreviated title | EUSIPCO
Country/Territory | France
City | Lyon
Period | 26/08/2024 → 30/08/2024