

Poster in Workshop: Localized Learning: Decentralized Model Updates via Non-Global Objectives

Internet Learning: Preliminary Steps Towards Highly Fault-Tolerant Learning on Device Networks

Surojit Ganguli · Avi Amalanshu · Amritanshu Ranjan · David I. Inouye

Keywords: [ Distributed Learning ] [ IL ] [ Feature Parallelism ] [ Decentralized Machine Learning ] [ Internet Learning ] [ Internet ] [ Fault Tolerance ] [ Local Learning ] [ Back Propagation ] [ Federated Learning ] [ Machine Learning ]


Abstract:

Distributed machine learning has grown in popularity due to data privacy, edge computing, and large-scale model training. One subset of this class, Vertical Federated Learning (VFL), aims to provide privacy guarantees in the scenario where every party shares the same sample space but holds only a subset of the features. While VFL tackles key privacy challenges, it typically assumes perfect hardware and communication, and may perform poorly when those assumptions fail. This hinders the broad deployment of VFL, particularly on edge devices, which may need to conserve power and may connect or disconnect at any time. To address this gap, we define the paradigm of Internet Learning (IL), a setting that subsumes VFL and makes good performance under extremely dynamic conditions among data-holding entities the primary goal. Because IL represents a fundamentally different paradigm, it will likely require novel learning algorithms beyond end-to-end backpropagation, which requires careful synchronization across devices. In light of this, we propose some potential approaches for the IL setting and present preliminary analysis and experimental results on a toy problem.
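
To make the setting concrete, below is a minimal NumPy sketch of the kind of system IL targets. This is our illustration, not the paper's algorithm: the `Device` class, its local least-squares objective, and the 0.7 per-step availability probability are all hypothetical choices. Each device holds a vertical slice of the features (as in VFL), trains with a purely local objective so that no gradients cross device boundaries and no end-to-end synchronization is needed, and may drop out of any step; aggregation simply uses whichever messages arrive.

```python
# Hypothetical sketch (not the authors' method): vertically partitioned
# features, local-only updates, and random device dropout at every step.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples, d features, a noisy linear target.
n, d, k = 256, 12, 3               # k devices, each holding d // k features
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

class Device:
    """Holds one vertical slice of the features and a local linear model."""
    def __init__(self, cols):
        self.cols = cols
        self.w = np.zeros(len(cols))

    def local_step(self, X, y, lr=0.01):
        # Local objective: fit y from this device's features alone.
        # No gradients flow between devices, so no synchronization is needed.
        Xi = X[:, self.cols]
        grad = Xi.T @ (Xi @ self.w - y) / len(y)
        self.w -= lr * grad

    def message(self, X):
        # Partial prediction sent upstream whenever the device is online.
        return X[:, self.cols] @ self.w

devices = [Device(cols) for cols in np.array_split(np.arange(d), k)]

for step in range(1, 501):
    # Simulate faults: each device is online with probability 0.7 (assumed).
    online = [dev for dev in devices if rng.random() < 0.7]
    for dev in online:
        dev.local_step(X, y)
    if step % 100 == 0:
        # Aggregation degrades gracefully: sum whatever messages arrived.
        pred = sum(dev.message(X) for dev in online)
        print(f"step {step}: {len(online)}/{k} devices online, "
              f"MSE={np.mean((pred - y) ** 2):.3f}")
```

Because every update is local, a disconnected device neither stalls the others nor corrupts the aggregate; it simply resumes learning when it reconnects. End-to-end backpropagation, by contrast, would block the backward pass until all devices report their gradients.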
