Optimal Parameter-free Online Learning with Switching Cost
Zhiyu Zhang · Ashok Cutkosky · Ioannis Paschalidis

Parameter-freeness in online learning refers to the adaptivity of an algorithm with respect to the optimal decision in hindsight. In this paper, we design such algorithms in the presence of switching costs, which penalize the optimistic updates required by parameter-freeness and thus lead to a delicate design trade-off. Based on a novel dual space scaling strategy, we propose a simple yet powerful algorithm for Online Linear Optimization (OLO) with switching cost, which improves the existing suboptimal regret bound (Zhang et al., 2022a) to the optimal rate. The obtained benefit is extended to the expert setting, and the practicality of our algorithm is demonstrated through a sequential investment task.
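To make the setting concrete, here is a minimal sketch of the OLO-with-switching-cost protocol the abstract refers to: at each round the learner picks a point, suffers a linear loss, and additionally pays for how far it moved from its previous point. The simple gradient-descent update below is only a placeholder learner for illustration, not the paper's parameter-free algorithm, and the names (`play_olo_with_switching_cost`, `regret_vs`, the weight `lam`) are assumptions of this sketch.

```python
import math

def play_olo_with_switching_cost(gradients, lam=1.0, lr=0.1):
    """Run a 1-D OLO game with switching cost.

    At round t the learner picks x_t, then observes a linear loss
    g_t * x_t and additionally pays lam * |x_t - x_{t-1}|.
    """
    x, x_prev = 0.0, 0.0
    total_loss, total_switch = 0.0, 0.0
    for g in gradients:
        total_loss += g * x                  # linear loss at the played point
        total_switch += lam * abs(x - x_prev)  # penalty for moving
        x_prev = x
        x = x - lr * g                       # placeholder update (plain OGD)
    return total_loss, total_switch

def regret_vs(u, gradients, lam=1.0, lr=0.1):
    """Regret (including switching cost) against a fixed comparator u."""
    loss, switch = play_olo_with_switching_cost(gradients, lam, lr)
    return loss + switch - u * sum(gradients)

# A toy adversarial gradient sequence.
gs = [1.0, -1.0, 1.0, -1.0, 1.0]
print(regret_vs(0.0, gs))
```

A parameter-free algorithm must keep this total small against every comparator `u` simultaneously, without knowing `u` in advance; the switching-cost term is what makes the aggressive updates used by standard parameter-free methods expensive.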

Author Information

Zhiyu Zhang (Boston University)

Hello! I am a PhD student at Boston University, advised by Prof. Yannis Paschalidis and Prof. Ashok Cutkosky. I am broadly interested in the theoretical aspects of machine learning, optimization and control theory. Specifically, I work on adaptive online learning, i.e., designing online decision making algorithms that optimally exploit problem structures.

Ashok Cutkosky (Boston University)
Ioannis Paschalidis (Boston University)
