
 
Tutorial
Parameter-free Online Optimization
Francesco Orabona · Ashok Cutkosky

Mon Jul 13 05:00 AM -- 08:00 AM & Mon Jul 13 03:00 PM -- 06:00 PM (PDT)

Classical stochastic optimization results typically assume known values for various properties of the data (e.g., Lipschitz constants, distance to an optimal point, smoothness or strong-convexity constants). Unfortunately, in practice these values are unknown, necessitating a long trial-and-error procedure to find the best parameters. To address this issue, in recent years a number of parameter-free algorithms have been developed for online optimization and online learning. Parameter-free algorithms make no assumptions about the properties of the data, yet converge just as fast as the optimally tuned algorithm. This is an exciting line of work that has now reached enough maturity to be taught to general audiences. Indeed, these algorithms have not yet been properly introduced to the machine learning community, and only a handful of people fully understand them. This tutorial aims at bridging this gap, presenting both the practice and the theory of using and designing parameter-free algorithms. We will present the latest advancements in this field, including practical applications.
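One well-known family of parameter-free methods is based on coin betting. As a rough, hedged illustration of the flavor of such algorithms (not necessarily the exact methods covered in the tutorial), the sketch below implements a per-coordinate Krichevsky-Trofimov (KT) coin-betting learner in NumPy. It assumes gradient coordinates are bounded in [-1, 1]; the class name, the `eps` initial-wealth parameter, and the toy quadratic objective are illustrative choices, not an established API. Note that `eps` is not a learning rate: the guarantees degrade only mildly if it is misspecified.

```python
import numpy as np

class KTCoinBetting:
    """Sketch of a per-coordinate KT coin-betting online learner.

    Assumes each gradient coordinate lies in [-1, 1]. The iterate is
    x_t = beta_t * wealth_{t-1}, where beta_t is the KT betting fraction
    and wealth tracks the gains of betting on the "coin" -g_t.
    """

    def __init__(self, dim, eps=1.0):
        self.t = 0                            # round counter
        self.sum_neg_grad = np.zeros(dim)     # running sum of -gradients
        self.wealth = np.full(dim, eps)       # initial wealth per coordinate
        self.x = np.zeros(dim)                # current iterate

    def predict(self):
        return self.x

    def update(self, grad):
        """grad: (sub)gradient of the loss at the current iterate self.x."""
        self.t += 1
        # Wealth gained by betting x on the coin outcome -grad.
        self.wealth -= grad * self.x
        self.sum_neg_grad -= grad
        # KT betting fraction, then bet that fraction of current wealth.
        beta = self.sum_neg_grad / (self.t + 1)
        self.x = beta * self.wealth

# Usage sketch: minimize a fixed convex function with (sub)gradients in [-1, 1].
learner = KTCoinBetting(dim=2)
target = np.array([0.3, -0.7])
avg, T = np.zeros(2), 1000
for _ in range(T):
    x = learner.predict()
    grad = np.clip(x - target, -1.0, 1.0)   # clipped gradient of 0.5*||x - target||^2
    learner.update(grad)
    avg += x
print(avg / T)   # average iterate is close to `target`, with no step size tuned
```

The point of the sketch is that no step size appears anywhere: the "betting fraction" adapts automatically, which is exactly the behavior that would otherwise require knowing quantities such as the distance to the optimum in advance.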

Author Information

Francesco Orabona (Boston University)

Francesco Orabona is an Associate Professor at KAUST. His background covers both theoretical and practical aspects of machine learning and optimization. His current research interests lie in online learning and, more generally, in designing and analyzing adaptive and parameter-free learning algorithms. He received his PhD in Electrical Engineering from the University of Genoa in 2007. He is the (co)author of more than 60 peer-reviewed papers.

Ashok Cutkosky (Boston University)

Ashok Cutkosky is a research scientist at Google and will be joining Boston University in the fall of 2020. He obtained his PhD in Computer Science from Stanford University in 2018. Prior to his work in optimization theory, he developed a background in pure mathematics as well as biology. He is currently working on parameter-free algorithms for practical online and stochastic optimization. He has given talks and written several papers on this topic at major conferences, and received the Best Student Paper award at COLT 2017 for his work in this area. He also enjoys magic tricks and science fiction.
