

Poster in Workshop: Federated Learning and Analytics in Practice: Algorithms, Systems, Applications, and Opportunities

Asynchronous Federated Learning with Bidirectional Quantized Communications and Buffered Aggregation

Tomas Ortega · Hamid Jafarkhani


Abstract:

Asynchronous Federated Learning with Buffered Aggregation (FedBuff) is a state-of-the-art algorithm known for its efficiency and high scalability. However, it has a high communication cost, which has not been examined with quantized communications. To tackle this problem, we present a new algorithm (QAFeL), with a quantization scheme that establishes a shared "hidden" state between the server and clients to avoid the error propagation caused by direct quantization. This approach allows for high precision while significantly reducing the data transmitted during client-server interactions. We provide theoretical convergence guarantees for QAFeL and corroborate our analysis with experiments on a standard benchmark.
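The abstract does not spell out the mechanism, but a shared hidden state is commonly realized by having both endpoints quantize the residual against a synchronized reference and apply the same quantized update on each side, so quantization error does not accumulate across rounds. Below is a minimal sketch of that idea in Python; the quantizer, variable names, and helper function are illustrative assumptions, not QAFeL's actual implementation.

```python
import numpy as np

def quantize(v, num_levels=16):
    # Illustrative uniform quantizer: scales to the observed range and
    # rounds to a fixed number of levels. QAFeL's actual quantizer is
    # assumed here, not taken from the paper.
    scale = np.max(np.abs(v)) + 1e-12
    return np.round(v / scale * (num_levels - 1)) / (num_levels - 1) * scale

def send_compressed(x, hidden):
    # Instead of quantizing x directly, quantize the residual against
    # the shared hidden state, then update the hidden state with the
    # same quantized value. Because sender and receiver apply identical
    # updates to their copies of `hidden`, the two states stay in sync
    # and quantization error does not compound over rounds.
    delta = quantize(x - hidden)
    return delta, hidden + delta

# Shared hidden state, initialized identically on server and client
# (e.g., to the initial model parameters).
hidden = np.zeros(10)

# Example round: only the low-precision `delta` crosses the network;
# the receiver reconstructs its estimate as the updated hidden state.
x = np.random.randn(10)
delta, hidden = send_compressed(x, hidden)
x_hat = hidden  # receiver's estimate of x after applying delta
```

In a bidirectional setting, the same scheme would be applied on both links: clients compress their updates against one shared state, and the server compresses the broadcast model against another, which is consistent with the "bidirectional quantized communications" in the title.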
