Private Multi-Task Learning: Formulation and Applications to Federated Learning
Shengyuan Hu · Steven Wu · Virginia Smith

Many problems in machine learning rely on multi-task learning (MTL), in which the goal is to solve multiple related machine learning tasks simultaneously. MTL is particularly relevant for privacy-sensitive applications in areas such as healthcare, finance, and IoT computing, where sensitive data from multiple, varied sources are shared for the purpose of learning. In this work, we formalize notions of task-level privacy for MTL via joint differential privacy (JDP), a relaxation of differential privacy for mechanism design and distributed optimization. We then propose an algorithm for mean-regularized MTL, an objective commonly used for applications in personalized federated learning, subject to JDP. We analyze our objective and solver, providing certifiable guarantees on both privacy and utility. Empirically, our method allows for improved privacy/utility trade-offs relative to global baselines across common federated learning benchmarks.
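The abstract leaves the objective and mechanism to the paper itself; as a rough illustration only, below is a minimal Python sketch of one round of mean-regularized MTL in which each task fits its own model against a shared mean, and that mean, the only cross-task statistic, is released through a clipped Gaussian mechanism. The function names (`local_update`, `noisy_mean`) and parameters (`lam`, `clip`, `sigma`) are hypothetical choices for this sketch, not the authors' algorithm or its JDP accounting.

```python
import numpy as np

def local_update(w_k, grad_fn, w_bar, lam=0.1, lr=0.05, steps=10):
    # Gradient steps on the mean-regularized objective for task k:
    #   F_k(w) + (lam / 2) * ||w - w_bar||^2
    # (illustrative update rule; not the paper's exact solver)
    for _ in range(steps):
        w_k = w_k - lr * (grad_fn(w_k) + lam * (w_k - w_bar))
    return w_k

def noisy_mean(ws, clip=1.0, sigma=1.0, rng=None):
    # Clip each task's model to bound per-task sensitivity, then
    # release the mean via the Gaussian mechanism; this mean is the
    # only quantity that leaves the tasks in this sketch.
    rng = rng or np.random.default_rng(0)
    clipped = [w * min(1.0, clip / (np.linalg.norm(w) + 1e-12)) for w in ws]
    mean = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, sigma * clip / len(ws), size=mean.shape)
    return mean + noise

# Example round with K quadratic local losses F_k(w) = 0.5*||w - t_k||^2:
#   ws = [local_update(w, lambda w, t=t_k: w - t, w_bar) for w, t_k in ...]
#   w_bar = noisy_mean(ws)
```

Privatizing only the shared mean while keeping each task's personalized model local is what makes a joint-DP-style analysis natural: the guarantee protects each task's data from all other tasks, without noising the task's own output.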

Author Information

Shengyuan Hu (Carnegie Mellon University)
Steven Wu (Carnegie Mellon University)
Virginia Smith (Carnegie Mellon University)

Virginia Smith is an assistant professor in the Machine Learning Department at Carnegie Mellon University, and a courtesy faculty member in the Electrical and Computer Engineering Department. Her research interests span machine learning, optimization, and distributed systems. Prior to CMU, Virginia was a postdoc at Stanford University, received a Ph.D. in Computer Science from UC Berkeley, and obtained undergraduate degrees in Mathematics and Computer Science from the University of Virginia.
