Learning with Vertically-Partitioned Data, Binary Feedback, and Random Parameter Update

Ngu Nguyen, Stephan Sigg

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review



Machine learning models can deal with data samples scattered among distributed agents, each of which holds a non-overlapping set of sample features. In this paper, we propose a training algorithm that does not require communication between these agents. A coordinator accesses ground-truth labels and produces binary feedback to guide the optimization process towards optimal model parameters. We mimic the gradient descent technique with information observed locally at each agent. We experimented with the logistic regression model on multiple benchmark datasets and achieved promising results in terms of convergence rate and communication load.
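The abstract does not spell out the exact update rule, so the following is only a minimal sketch of one plausible reading of the scheme: each agent holds a vertical slice of the features and perturbs its local parameters at random, and a coordinator holding the labels broadcasts a single accept/reject bit per round. All data, the feature split, and the perturbation step size below are hypothetical, made up purely for illustration.

```python
import math
import random

random.seed(0)

# Hypothetical synthetic data: 8 features split across two agents (4 each),
# labels generated from a hidden linear rule.
n, d = 200, 8
true_w = [random.uniform(-1, 1) for _ in range(d)]
X = [[random.uniform(-1, 1) for _ in range(d)] for _ in range(n)]
y = [1 if sum(w * x for w, x in zip(true_w, row)) > 0 else 0 for row in X]

splits = [(0, 4), (4, 8)]        # vertical partition: agent -> feature slice
weights = [[0.0] * 4, [0.0] * 4]  # each agent's local parameters

def partial_scores(wa, a):
    """Agent a's contribution to the linear score, using only its own slice."""
    lo, hi = splits[a]
    return [sum(w * X[i][j] for w, j in zip(wa, range(lo, hi)))
            for i in range(n)]

def log_loss(scores_per_agent):
    """Coordinator: combine per-agent partial scores, compare with labels."""
    loss = 0.0
    for i in range(n):
        z = sum(s[i] for s in scores_per_agent)
        p = 1.0 / (1.0 + math.exp(-z))
        p = min(max(p, 1e-12), 1 - 1e-12)
        loss += -(y[i] * math.log(p) + (1 - y[i]) * math.log(1 - p))
    return loss / n

step = 0.1  # perturbation scale (assumed, not from the paper)
best = log_loss([partial_scores(weights[a], a) for a in range(2)])
for t in range(300):
    # Each agent proposes a random perturbation of its own parameters only;
    # agents never exchange features or parameters with each other.
    proposal = [[w + random.gauss(0, step) for w in weights[a]]
                for a in range(2)]
    cand = log_loss([partial_scores(proposal[a], a) for a in range(2)])
    feedback = cand < best  # the single bit broadcast by the coordinator
    if feedback:
        weights, best = proposal, cand

print(f"final training log-loss: {best:.3f}")
```

Note the communication pattern this sketch assumes: agents upload partial scores to the coordinator and receive one bit back, which is why no agent-to-agent channel is needed.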

Original language: English
Title of host publication: INFOCOM 2019 - IEEE Conference on Computer Communications Workshops, INFOCOM WKSHPS 2019
Number of pages: 6
ISBN (Electronic): 9781728118789
Publication status: Published - 1 Apr 2019
MoE publication type: A4 Conference publication
Event: IEEE Conference on Computer Communications - Paris, France
Duration: 29 Apr 2019 - 2 May 2019
Conference number: 38

Publication series

Name: IEEE Conference on Computer Communications
ISSN (Print): 0743-166X


Conference: IEEE Conference on Computer Communications
Abbreviated title: INFOCOM


Keywords

  • distributed features
  • distributed optimization
  • gradient descent
  • logistic regression


