Spotlight Poster
The Role of Randomness in Stability
Max Hopkins · Shay Moran
West Exhibition Hall B2-B3 #W-900
Stability is a central tenet of algorithm design stating that feeding "similar inputs" into an algorithm A should beget "similar outputs". It is well known that any strongly stable (e.g., private) algorithm must be randomized, but despite years of work on the cost of stability in machine learning, we have almost no understanding of how much randomness is needed, even in basic settings like binary classification.

We characterize the amount of randomness needed for a task by the best weak stability achieved by any deterministic algorithm solving the problem. Using connections between weak stability and combinatorial structure in learning, we give the first randomness-efficient stable algorithms for basic learning tasks, along with corresponding lower bounds.

In the era of big data, stability is a critical property, both to protect user data and to ensure the reliability of our algorithms. Our work sheds new light on an understudied resource needed to achieve stability in theory and practice.
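To make the role of randomness concrete, here is a hypothetical sketch (not the paper's algorithm) contrasting a deterministic learner with a randomized one that uses shared randomness. The deterministic learner outputs the empirical mean as a threshold, so any change to the sample changes its output; the randomized learner snaps the mean to a randomly shifted grid, so two runs on similar samples that share the same seed usually produce the exact same output. The function names, grid width, and sample values are all illustrative assumptions.

```python
import random

def deterministic_learner(sample):
    # Deterministic: output the empirical mean as a threshold.
    # Any perturbation of the sample perturbs the output.
    return sum(sample) / len(sample)

def randomized_learner(sample, seed, grid=0.5):
    # Randomized: round the empirical mean to a grid whose offset is
    # drawn from shared randomness (the seed). Two similar samples
    # usually land on the same grid point, giving identical outputs.
    rng = random.Random(seed)
    offset = rng.uniform(0, grid)
    mean = sum(sample) / len(sample)
    return round((mean - offset) / grid) * grid + offset

# Two "similar" samples: they differ in a single point.
s1 = [0.1, 0.2, 0.3, 0.4, 0.5]
s2 = [0.1, 0.2, 0.3, 0.4, 0.6]

# The deterministic outputs differ whenever the means differ at all...
print(deterministic_learner(s1) == deterministic_learner(s2))  # False

# ...but with shared randomness, the two runs agree most of the time:
# they disagree only when a grid boundary falls between the two means.
agree = sum(randomized_learner(s1, seed) == randomized_learner(s2, seed)
            for seed in range(1000))
print(agree / 1000)
```

The sketch illustrates why strong stability forces randomization: no deterministic rule can map every pair of similar samples to identical outputs, while a randomly shifted grid achieves agreement with high probability over the shared seed.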