## Personalized Federated Learning through Local Memorization

### Othmane Marfoq · Giovanni Neglia · Richard Vidal · Laetitia Kameni

##### Hall E #724

Keywords: [ MISC: Transfer, Multitask and Meta-learning ] [ OPT: Large Scale, Parallel and Distributed ]

Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT

Spotlight presentation: Deep Learning/Optimization
Wed 20 Jul 1:30 p.m. PDT — 3 p.m. PDT

Abstract: Federated learning allows clients to collaboratively learn statistical models while keeping their data local. Federated learning was originally used to train a single global model to be served to all clients, but this approach might be sub-optimal when clients' local data distributions are heterogeneous. To tackle this limitation, recent personalized federated learning methods train a separate model for each client while still leveraging the knowledge available at other clients. In this work, we exploit the ability of deep neural networks to extract high-quality vectorial representations (embeddings) from non-tabular data, e.g., images and text, to propose a personalization mechanism based on local memorization. Personalization is obtained by interpolating a collectively trained global model with a local $k$-nearest neighbors (kNN) model based on the shared representation provided by the global model. We provide generalization bounds for the proposed approach in the case of binary classification, and we show on a suite of federated datasets that this approach achieves significantly higher accuracy and fairness than state-of-the-art methods.
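The interpolation mechanism the abstract describes can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it only shows the general idea under simplifying assumptions: each client memorizes the embeddings of its local examples (as produced by the shared global model's feature extractor, here left abstract), builds a kNN label distribution for a query, and mixes it with the global model's predicted distribution via a hypothetical mixing weight `lam`.

```python
import numpy as np

def knn_distribution(query_emb, memo_embs, memo_labels, k, n_classes):
    """Label distribution over the k nearest memorized embeddings.

    memo_embs / memo_labels form the client's local datastore of
    (embedding, label) pairs; distances are Euclidean and neighbor
    votes are weighted uniformly (a simplifying assumption).
    """
    dists = np.linalg.norm(memo_embs - query_emb, axis=1)
    nearest = np.argsort(dists)[:k]
    probs = np.zeros(n_classes)
    for lbl in memo_labels[nearest]:
        probs[lbl] += 1.0 / k
    return probs

def personalized_predict(p_global, p_knn, lam):
    """Interpolate the global model's prediction with the local kNN model.

    lam (in [0, 1]) is a hypothetical per-client mixing weight:
    lam = 0 recovers the global model, lam = 1 the purely local kNN model.
    """
    return lam * p_knn + (1.0 - lam) * p_global

# Toy local datastore: 2-D embeddings with binary labels.
memo_embs = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
memo_labels = np.array([0, 0, 1, 1])

p_knn = knn_distribution(np.array([0.0, 0.5]), memo_embs, memo_labels,
                         k=2, n_classes=2)
p_global = np.array([0.4, 0.6])          # stand-in global-model output
p = personalized_predict(p_global, p_knn, lam=0.5)
```

With this toy data the two nearest neighbors both carry label 0, so the kNN vote pulls the final prediction toward class 0 even though the stand-in global model favors class 1.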
