Towards Model-Agnostic Federated Learning over Networks

A. Jung*, S. Abdurakhmanova, O. Kuznetsova, Y. Sarcheshmehpour

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-reviewed


We present a model-agnostic federated learning method for decentralized data with an intrinsic network structure. The network structure reflects similarities between the (statistics of) local datasets and, in turn, their associated local ("personal") models. Our method is an instance of empirical risk minimization, with a regularization term derived from the network structure of the data. In particular, we require well-connected local models, which form clusters, to yield similar predictions on a common test set. The proposed method allows for a wide range of local models. The only restriction placed on these local models is that they allow for an efficient implementation of regularized empirical risk minimization (training). Such implementations might be available in the form of high-level programming frameworks such as scikit-learn, Keras or PyTorch.
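The coupling described above can be sketched in a few lines of numpy: each node trains a local model on its own data, while a penalty term pushes models that are connected by an edge to agree on their predictions for a shared test set. This is a minimal illustrative sketch, not the paper's implementation; the toy graph, data, and linear local models are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: 4 nodes forming two clusters, with edges
# only between nodes whose local datasets are statistically similar.
n_nodes, d = 4, 3
edges = [(0, 1), (2, 3)]
X = [rng.normal(size=(20, d)) for _ in range(n_nodes)]
w_true = [np.ones(d), np.ones(d), -np.ones(d), -np.ones(d)]
y = [X[i] @ w_true[i] + 0.1 * rng.normal(size=20) for i in range(n_nodes)]
X_test = rng.normal(size=(10, d))   # common (unlabelled) test set

alpha, lr = 1.0, 0.01               # regularization strength, step size
w = [np.zeros(d) for _ in range(n_nodes)]

for _ in range(500):
    # gradient of each node's local squared-error risk
    grads = [X[i].T @ (X[i] @ w[i] - y[i]) / len(y[i]) for i in range(n_nodes)]
    # gradient of the network penalty: squared difference of the
    # predictions that connected models produce on the common test set
    for i, j in edges:
        diff = X_test.T @ (X_test @ (w[i] - w[j])) / len(X_test)
        grads[i] += alpha * diff
        grads[j] -= alpha * diff
    w = [w[i] - lr * grads[i] for i in range(n_nodes)]
```

After training, models within a cluster produce near-identical test-set predictions, while models in different clusters remain free to disagree; swapping the linear models for any scikit-learn, Keras or PyTorch estimator only changes the local training step, which is the sense in which the method is model-agnostic.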

Original language: English
Title of host publication: 31st European Signal Processing Conference, EUSIPCO 2023 - Proceedings
Number of pages: 5
ISBN (Electronic): 978-9-4645-9360-0
Publication status: Published - 2023
MoE publication type: A4 Conference publication
Event: European Signal Processing Conference - Helsinki, Finland
Duration: 4 Sept 2023 to 8 Sept 2023
Conference number: 31

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491


Conference: European Signal Processing Conference
Abbreviated title: EUSIPCO


  • complex networks
  • federated learning
  • heterogeneous
  • non-parametric
  • personalization


