
Continual Learning in Linear Classification on Separable Data
Itay Evron · Edward Moroshko · Gon Buzaglo · Maroun Khriesh · Badea Marjieh · Nati Srebro · Daniel Soudry

Tue Jul 25 02:00 PM -- 04:30 PM (PDT) @ Exhibit Hall 1 #807

We analyze continual learning on a sequence of separable linear classification tasks with binary labels. We show theoretically that learning with weak regularization reduces to solving a sequential max-margin problem, corresponding to a special case of the Projection Onto Convex Sets (POCS) framework. We then develop upper bounds on the forgetting and other quantities of interest under various settings with recurring tasks, including cyclic and random orderings of tasks. We discuss several practical implications for popular training practices like regularization scheduling and weighting. We point out several theoretical differences between our continual classification setting and a recently studied continual regression setting.
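To make the POCS connection concrete, here is a minimal illustrative sketch (not the paper's exact algorithm) of cyclic projections onto convex sets, where each task contributes a polyhedron of margin constraints {w : y_i ⟨x_i, w⟩ ≥ 1} and tasks are revisited cyclically. The toy data below is hypothetical and chosen only so that the constraint sets intersect:

```python
import numpy as np

def project_halfspace(w, a, b):
    """Project w onto the halfspace {v : <a, v> >= b}."""
    gap = b - a @ w
    if gap > 0:  # constraint violated: move to the nearest boundary point
        w = w + (gap / (a @ a)) * a
    return w

# Hypothetical toy data: two linearly separable tasks, each defining
# a convex polyhedron of margin constraints {w : y_i <x_i, w> >= 1}.
tasks = [
    [(np.array([1.0, 0.0]), 1), (np.array([0.5, -1.0]), -1)],
    [(np.array([0.0, 1.0]), 1), (np.array([-1.0, 0.3]), -1)],
]

w = np.zeros(2)
for _ in range(200):             # cyclic sweeps over the recurring tasks
    for task in tasks:
        for x, y in task:        # project onto each margin constraint
            w = project_halfspace(w, y * x, 1.0)

# When the intersection is nonempty, POCS converges to a point
# satisfying every constraint: y_i <x_i, w> >= 1 for all tasks.
print(all(y * (x @ w) >= 1 - 1e-6 for task in tasks for x, y in task))
```

Each pass projects the current weights onto one halfspace at a time; with a nonempty intersection, the iterates converge to a point classifying all tasks with margin, which is the geometric picture behind the sequential max-margin reduction described in the abstract.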

Author Information

Itay Evron (Technion)
Edward Moroshko (Technion)
Gon Buzaglo (Technion)
Maroun Khriesh (Technion - Israel Institute of Technology)
Badea Marjieh (Technion)
Nati Srebro (Toyota Technological Institute at Chicago)
Daniel Soudry (Technion)
