Poster in Workshop: Localized Learning: Decentralized Model Updates via Non-Global Objectives

Co-Dream: Collaborative data synthesis with decentralized models

Abhishek Singh · Gauri Gupta · Charles Lu · Yogesh Koirala · Sheshank Shankar · Mohammed Ehab · Ramesh Raskar

Keywords: [ Federated Learning ] [ Collaborative Inference ] [ Deep Learning ] [ Data Synthesis ]


Abstract:

We present a framework for distributed optimization that addresses the decentralized and siloed nature of real-world data. Existing work in federated learning addresses this by learning a centralized model from decentralized data. Our framework, Co-Dream, instead focuses on learning a representation of the data itself. Starting from random data and jointly synthesizing samples across distributed clients, we aim to create proxies that represent the global data distribution. Importantly, this collaborative synthesis uses only local models, offering privacy comparable to sharing the models themselves. Collaboration among clients is facilitated through federated optimization in the data space, leveraging shared input gradients computed from local losses. Collaborative data synthesis offers several benefits over collaborative model learning, including lower dimensionality, parameter-independent communication, and adaptive optimization. We empirically validate the effectiveness of our framework and compare its performance with traditional federated learning approaches through benchmarking experiments.
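To make the mechanism concrete, below is a minimal sketch of one round of the data-space federated optimization the abstract describes: each client computes the gradient of its local loss with respect to a shared batch of synthetic samples, and the aggregated input gradient updates the data rather than any model. This is an illustrative assumption of the procedure, not the authors' implementation; the names co_dream_round, local_models, dreams, and num_rounds are hypothetical.

```python
import torch
import torch.nn.functional as F

def co_dream_round(dreams, labels, local_models, lr=0.1):
    """One federated step that updates the shared synthetic data ("dreams"),
    not model weights. Sketch only; hyperparameters are placeholders."""
    client_grads = []
    for model in local_models:  # each client's model stays local and private
        x = dreams.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), labels)  # local loss on shared dreams
        (grad,) = torch.autograd.grad(loss, x)    # gradient w.r.t. the data
        client_grads.append(grad)
    # Server aggregates input gradients (FedAvg-style, but in the data space)
    avg_grad = torch.stack(client_grads).mean(dim=0)
    return dreams - lr * avg_grad                 # gradient descent on the data

# Usage: start from random noise and jointly refine it over several rounds.
# dreams = torch.randn(64, 3, 32, 32)
# labels = torch.randint(0, 10, (64,))
# for _ in range(num_rounds):
#     dreams = co_dream_round(dreams, labels, local_models)
```

Note how this realizes the claimed benefits: what travels over the network is a gradient of the same shape as the data batch, so communication cost is independent of model size and typically lower-dimensional than exchanging model parameters.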
