Quantum federated learning based on gradient descent

Kai Yu, Xin Zhang, Zi Ye, Gong-De Guo, Song Lin
Dec 2022
Abstract
Federated learning is a distributed learning framework in machine learning and has been widely studied recently. Generally speaking, there are two main challenges in the federated learning process: high computational cost and the security of the transmitted messages. To address these challenges, we utilize some intriguing characteristics of quantum mechanics to propose a framework for quantum federated learning based on gradient descent. The proposed framework consists of two components. One is a quantum gradient descent algorithm, which has been demonstrated to achieve exponential acceleration in the dataset scale and a quadratic speedup in the data dimensionality over its classical counterpart; that is, each client can train its gradients quickly on a quantum platform. The other is a quantum secure multi-party computation protocol that aims to compute the federated gradients securely. The security analysis shows that this quantum protocol can resist some common external and internal attacks, so the local gradients can be aggregated securely. Finally, to illustrate the effectiveness of the proposed framework, we apply it to train federated linear regression models and successfully implement some key computation steps on the Qiskit quantum computing framework.
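For concreteness, below is a minimal classical sketch of the federated gradient-descent step for linear regression that the framework speeds up (via the quantum gradient descent routine) and protects (via the quantum secure multi-party computation protocol). The least-squares loss, size-weighted averaging, learning rate, and function names are illustrative assumptions rather than details from the paper, and the aggregation here is computed in the clear, whereas the paper performs it with a quantum secure protocol.

```python
import numpy as np

def local_gradient(X, y, w):
    """Least-squares gradient (2/m) X^T (Xw - y) computed by one client."""
    m = X.shape[0]
    return (2.0 / m) * X.T @ (X @ w - y)

def federated_gradient(local_grads, weights):
    """Weighted average of the clients' local gradients.
    In the paper this aggregation is carried out by a quantum secure
    multi-party computation protocol; here it is computed in the clear."""
    return sum(wk * g for wk, g in zip(weights, local_grads))

def train(clients, w0, eta=0.1, iters=200):
    """Federated gradient descent for linear regression.
    `clients` is a list of (X_k, y_k) pairs; dataset sizes weight the average."""
    w = w0.copy()
    sizes = np.array([X.shape[0] for X, _ in clients], dtype=float)
    weights = sizes / sizes.sum()
    for _ in range(iters):
        grads = [local_gradient(X, y, w) for X, y in clients]
        w -= eta * federated_gradient(grads, weights)
    return w

# Toy usage: two clients whose data follow the same linear model.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 3))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=50)))
print(train(clients, np.zeros(3)))
```

In the proposed quantum setting, the per-client gradient computation and the secure aggregation step above are the two pieces replaced by the quantum gradient descent algorithm and the quantum multi-party computation protocol, respectively.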