Approximate Gradient Coding for Privacy-Flexible Federated Learning with Non-IID Data

Okko Makkonen*, Sampo Niemelä*, Camilla Hollanti*, Serge Kas Hanna

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-reviewed

Abstract

This work focuses on the challenges of non-IID data and stragglers/dropouts in federated learning. We introduce and explore a privacy-flexible paradigm that models parts of the clients’ local data as non-private, offering a more versatile and business-oriented perspective on privacy. Within this framework, we propose a data-driven strategy for mitigating the effects of label heterogeneity and client straggling on federated learning. Our solution combines both offline data sharing and approximate gradient coding techniques. Through theoretical results and numerical simulations on the MNIST dataset, we demonstrate that our approach enables achieving a deliberate trade-off between privacy and utility, leading to improved model convergence and accuracy while using an adaptable portion of non-private data.
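To make the gradient-coding ingredient of the abstract concrete, below is a minimal, hypothetical Python sketch of approximate gradient coding with a cyclic redundant assignment of data partitions to clients. It is not the authors' exact scheme or their data-sharing strategy, only an illustration of the general idea: each data partition is placed on several clients, each responding client returns the sum of the partial gradients of its assigned partitions, and the server rescales the messages from non-stragglers to approximate the full gradient. The names (`num_workers`, `redundancy`, `approx_gradient`, the synthetic least-squares problem) are assumptions made for the sketch.

```python
# Hypothetical sketch of approximate gradient coding with cyclic redundancy.
# Not the scheme from the paper; an illustration of the general technique.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: full gradient is X^T (X w - y) / n.
n, d = 600, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

num_workers = 6     # number of clients (assumed for the sketch)
redundancy = 2      # each data partition is stored on `redundancy` clients
parts = np.array_split(np.arange(n), num_workers)

def partial_gradient(w, idx):
    """Gradient of 0.5 * ||X w - y||^2 / n restricted to the samples in `idx`."""
    Xi, yi = X[idx], y[idx]
    return Xi.T @ (Xi @ w - yi) / n

def worker_message(w, worker):
    """A client sums the partial gradients of its cyclically assigned partitions."""
    assigned = [(worker + j) % num_workers for j in range(redundancy)]
    return sum(partial_gradient(w, parts[p]) for p in assigned)

def approx_gradient(w, straggler_prob=0.3):
    """Server combines messages from non-straggling clients.

    Each partition appears on `redundancy` clients, so the sum of all messages
    equals `redundancy` times the full gradient; rescaling the messages of the
    responding clients gives an approximation of the full gradient.
    """
    alive = [i for i in range(num_workers) if rng.random() > straggler_prob]
    if not alive:
        return np.zeros(d)
    scale = num_workers / (len(alive) * redundancy)
    return scale * sum(worker_message(w, i) for i in alive)

# Plain gradient descent driven by the approximate gradients.
w = np.zeros(d)
for _ in range(200):
    w -= 0.5 * approx_gradient(w)
print("distance to true weights:", np.linalg.norm(w - w_true))
```

In this toy setting the redundancy lets the server tolerate random stragglers while still tracking the full-data gradient on average; the paper combines this kind of coding with offline sharing of the non-private portion of the clients' data to also counter label heterogeneity.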

Original language: English
Title of host publication: 32nd European Signal Processing Conference, EUSIPCO 2024 - Proceedings
Publisher: European Signal Processing Conference (EUSIPCO)
Pages: 1167-1171
Number of pages: 5
ISBN (Electronic): 978-9-4645-9361-7
DOIs
Publication status: Published - 2024
MoE publication type: A4 Conference publication
Event: European Signal Processing Conference - Lyon, France
Duration: 26 Aug 2024 - 30 Aug 2024
Conference number: 32

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491

Conference

Conference: European Signal Processing Conference
Abbreviated title: EUSIPCO
Country/Territory: France
City: Lyon
Period: 26/08/2024 - 30/08/2024
