Expo Talk Panel
Hall A8

Private Federated Learning (PFL) is an approach to collaboratively training a machine learning model across edge devices, coordinated by a central server, whilst preserving the privacy of the data on each device. PFL is an emerging field, with exponential growth in the number of papers published over the past few years and several big tech companies investing heavily in its practical applications. Researchers commonly perform experiments in a simulation environment to iterate quickly on PFL ideas. However, previous open-source tools do not offer the efficiency required to simulate FL on larger and more realistic FL datasets. We introduce pfl-research (https://github.com/apple/pfl-research), a fast, modular, and easy-to-use Python framework for simulating FL and PFL that is 7-72x faster than alternative open-source frameworks. In this talk, we will start by briefly introducing Federated Learning and describe techniques for preserving the privacy of participating edge devices. We will then dive deeper into the unique challenges encountered in PFL and explore research problems that persist today. Finally, we will introduce pfl-research, its features, and its performance. We demonstrate how researchers can use pfl-research to significantly boost their productivity with intuitive interfaces and fast distributed simulations, including fine-tuning LLMs in PFL.
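To make the simulation setting concrete, the sketch below shows the core server-coordinated training loop (federated averaging) that such frameworks simulate. This is a minimal generic illustration in NumPy, not the pfl-research API; all function names and the toy linear-regression task are invented for this example, and the privacy mechanisms (e.g. differential privacy noise) that distinguish PFL from plain FL are omitted.

```python
import numpy as np

def local_sgd(weights, data, targets, lr=0.1, epochs=1):
    # One simulated client's local training: plain SGD on a
    # linear-regression least-squares loss over its private data.
    w = weights.copy()
    for _ in range(epochs):
        preds = data @ w
        grad = data.T @ (preds - targets) / len(targets)
        w -= lr * grad
    return w

def federated_averaging(global_w, client_datasets, rounds=5):
    # Server loop: each round, every client trains locally and the
    # server averages the returned weights, weighted by dataset size.
    # Raw data never leaves a client; only model weights are shared.
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in client_datasets:
            updates.append(local_sgd(global_w, X, y))
            sizes.append(len(y))
        global_w = np.average(
            np.stack(updates), axis=0,
            weights=np.array(sizes, dtype=float))
    return global_w

# Toy setup: 4 clients, each holding samples from the same
# linear model y = X @ [2, -1] plus small noise.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=50)
    clients.append((X, y))

w = federated_averaging(np.zeros(2), clients, rounds=50)
print(np.round(w, 2))  # converges close to [2.0, -1.0]
```

A simulator's job is to run many such rounds efficiently; a PFL variant would additionally clip and add noise to each client's update before aggregation.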