

Spotlight in Workshop: Dynamic Neural Networks

Slimmable Quantum Federated Learning

Won Joon Yun · Jae Pyoung Kim · Soyi Jung · Jihong Park · Mehdi Bennis · Joongheon Kim


Abstract:

Quantum federated learning (QFL) has recently received increasing attention, as quantum neural networks (QNNs) are integrated into federated learning (FL). In contrast to existing static QFL methods, this article proposes slimmable QFL (SlimQFL), a dynamic QFL framework that copes with time-varying communication channels and computing-energy limitations. This is made viable by leveraging the unique nature of a QNN, whose angle parameters and pole parameters can be trained separately and exploited dynamically. Simulation results corroborate that SlimQFL achieves higher classification accuracy than Vanilla QFL on average, particularly under poor channel conditions.
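The following is a minimal sketch of the dynamic behavior described in the abstract: each client maintains separately trained angle and pole parameter sets and "slims" its upload to the pole parameters alone when the channel is poor. All names (`Client`, `upload`, `aggregate`, `good_channel`) and the random placeholder updates are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class Client:
    """Hypothetical SlimQFL client holding two separable QNN parameter sets."""

    def __init__(self, n_params: int):
        self.angle = rng.normal(size=n_params)  # angle parameters of the QNN
        self.pole = rng.normal(size=n_params)   # pole parameters of the QNN

    def local_update(self, lr: float = 0.1):
        # Placeholder for separate gradient steps on each parameter set.
        self.angle -= lr * rng.normal(size=self.angle.shape)
        self.pole -= lr * rng.normal(size=self.pole.shape)

    def upload(self, good_channel: bool):
        # Good channel: send both sets. Poor channel: slim the upload
        # down to the pole parameters only.
        if good_channel:
            return {"angle": self.angle, "pole": self.pole}
        return {"pole": self.pole}

def aggregate(uploads, key):
    # Average a parameter set over the clients that managed to upload it.
    vals = [u[key] for u in uploads if key in u]
    return np.mean(vals, axis=0) if vals else None

clients = [Client(n_params=8) for _ in range(4)]
for c in clients:
    c.local_update()
uploads = [c.upload(good_channel=rng.random() > 0.5) for c in clients]
global_angle = aggregate(uploads, "angle")
global_pole = aggregate(uploads, "pole")
```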
