GLASU: A Communication-Efficient Algorithm for Federated Learning with Vertically Distributed Graph Data

Xinwei Zhang, Mingyi Hong, Jie Chen
Mar 2023
Abstract
Vertical federated learning (VFL) is a distributed learning paradigm, where computing clients collectively train a model based on the partial features of the same set of samples they possess. Current research on VFL focuses on the case when samples are independent, but it rarely addresses an emerging scenario when samples are interrelated through a graph. For graph-structured data, graph neural networks (GNNs) are competitive machine learning models, but a naive implementation in the VFL setting causes a significant communication overhead. Moreover, the analysis of the training is faced with a challenge caused by the biased stochastic gradients. In this paper, we propose a model splitting method that splits a backbone GNN across the clients and the server and a communication-efficient algorithm, GLASU, to train such a model. GLASU adopts lazy aggregation and stale updates to skip aggregation when evaluating the model and skip feature exchanges during training, greatly reducing communication. We offer a theoretical analysis and conduct extensive numerical experiments on real-world datasets, showing that the proposed algorithm effectively trains a GNN model, whose performance matches that of the backbone GNN when trained in a centralized manner.
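The sketch below illustrates the split-model idea described in the abstract: each client runs local GNN layers on its own vertical feature slice, and the server aggregates embeddings only every Q-th layer (lazy aggregation). This is a minimal, assumed illustration, not the authors' reference implementation; all names (e.g., `local_gnn_layer`, `server_aggregate`, `Q`) and the averaging rules are placeholders.

```python
# Minimal sketch of lazy aggregation in a split VFL GNN (illustrative only;
# structure and names are assumptions, not the GLASU reference implementation).
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 3      # parties holding disjoint feature columns of the same nodes
NUM_LAYERS = 4       # layers of the split GNN backbone
Q = 2                # lazy aggregation: communicate only every Q-th layer
NUM_NODES, FEAT_DIM = 8, 5

# Each client holds a vertical slice of the node features; the graph is shared here
# for simplicity.
client_feats = [rng.normal(size=(NUM_NODES, FEAT_DIM)) for _ in range(NUM_CLIENTS)]
adj = np.eye(NUM_NODES) + (rng.random((NUM_NODES, NUM_NODES)) > 0.7)

def local_gnn_layer(h, a):
    """One client-side propagation step: neighborhood averaging as a stand-in for a GNN layer."""
    deg = a.sum(axis=1, keepdims=True)
    return np.tanh((a @ h) / deg)

def server_aggregate(client_embeds):
    """Server combines client embeddings; plain averaging is used as a placeholder."""
    return np.mean(client_embeds, axis=0)

# Forward pass with lazy aggregation: clients compute locally and only exchange
# embeddings with the server when the layer index is a multiple of Q.
h = [f.copy() for f in client_feats]
for layer in range(1, NUM_LAYERS + 1):
    h = [local_gnn_layer(h_c, adj) for h_c in h]
    if layer % Q == 0:                      # communication round
        agg = server_aggregate(h)
        h = [agg.copy() for _ in h]         # broadcast the aggregated embedding back
print("final node embedding shape:", h[0].shape)
```

With Q = 1 this reduces to aggregating after every layer (the communication-heavy naive scheme); larger Q skips rounds, which is the source of the communication savings the abstract describes. Stale updates would additionally reuse previously received embeddings across training iterations, which this forward-pass sketch does not model.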