
Quantum Federated Learning with Quantum Networks


Abstract:

A major concern with deep learning models is the large amount of data required to build and train them, much of which relies on sensitive and personally identifiable information that is vulnerable to access by third parties. Ideas for using the quantum internet to address this issue have been proposed previously, as it would enable fast and completely secure online communication. Previous work has yielded a hybrid quantum-classical transfer learning scheme for classical data and communication with a hub-spoke topology. While quantum communication is secure against eavesdropping attacks, since the no-cloning theorem rules out measurement during quantum-to-classical translation, a hub-spoke topology is not ideal for quantum communication without quantum memory. Here we seek to improve this model by implementing a decentralized ring topology for the federated learning scheme, where each client is given a portion of the entire dataset and performs training only on that portion. We also demonstrate the first successful use of quantum weights for quantum federated learning, which allows us to perform our training entirely in the quantum domain.
Date of Conference: 14-19 April 2024
Date Added to IEEE Xplore: 18 March 2024
Conference Location: Seoul, Korea, Republic of

1. INTRODUCTION

Federated learning has recently emerged as a major pillar of machine learning and deep learning due to growing privacy concerns over the large amounts of user data these models depend on. Such information should not be accessible to third parties, who, upon penetrating a network, could compromise and leak confidential data (e.g., financial data, medical records). Federated learning mitigates this concern through a decentralized architecture that keeps each client's data independent of and private from outsiders. Individuals can train their own models and contribute to the global model without revealing the contents of their local models.
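To make the decentralized scheme concrete, the following is a minimal classical sketch (not the paper's quantum implementation) of ring-topology federated learning on a least-squares task: each client trains only on its private shard, then averages its model with the model passed from its ring neighbor. The function names (`local_train`, `ring_federated_round`) and the toy dataset are illustrative assumptions, not from the paper.

```python
import numpy as np

def local_train(weights, data, lr=0.1):
    """One hypothetical local gradient-descent step on a least-squares loss."""
    X, y = data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def ring_federated_round(client_weights, client_data):
    """One round of ring-topology federated learning: each client trains on its
    own shard, then averages its model with the one received from the previous
    client in the ring (no central hub is ever involved)."""
    n = len(client_weights)
    trained = [local_train(w, d) for w, d in zip(client_weights, client_data)]
    return [(trained[i] + trained[(i - 1) % n]) / 2 for i in range(n)]

# Toy setup: four clients, each holding a private shard of a shared regression task.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
client_data = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    client_data.append((X, y))

weights = [np.zeros(2) for _ in range(4)]
for _ in range(200):
    weights = ring_federated_round(weights, client_data)
```

After enough rounds every client's model converges toward the shared solution even though no client ever sees another client's raw data; only model parameters travel around the ring, which is the property the quantum version secures with quantum communication.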
