Poster
Refined bounds for algorithm configuration: The knife-edge of dual class approximability
Nina Balcan · Tuomas Sandholm · Ellen Vitercik

Thu Jul 16 06:00 AM -- 06:45 AM & Thu Jul 16 06:00 PM -- 06:45 PM (PDT)

Automating algorithm configuration is growing increasingly necessary as algorithms come with more and more tunable parameters. It is common to tune parameters using machine learning, optimizing algorithmic performance (runtime or solution quality, for example) using a training set of problem instances from the specific domain at hand. We investigate a fundamental question about these techniques: how large should the training set be to ensure that a parameter’s average empirical performance over the training set is close to its expected future performance? We answer this question for algorithm configuration problems that exhibit a widely applicable structure: the algorithm's performance as a function of its parameters can be approximated by a “simple” function. We show that if this approximation holds under the L∞-norm, we can provide strong sample complexity bounds, but if the approximation holds only under the Lp-norm for p < ∞, it is not possible to provide meaningful sample complexity bounds in the worst case. We empirically evaluate our bounds in the context of integer programming, obtaining sample complexity bounds that are up to 700 times smaller than the best previously known bounds.
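To make the distinction concrete, here is an illustrative statement of the two approximation conditions (the notation is ours, not taken verbatim from the paper). Write u(ρ) for the algorithm's performance on a problem instance as a function of its parameter vector ρ, and f(ρ) for the “simple” approximating function. The L∞ condition demands a uniform guarantee over the entire parameter space, whereas an Lp condition with p < ∞ only controls an average:

\[
  \|u - f\|_{\infty} \;=\; \sup_{\rho}\,\bigl|u(\rho) - f(\rho)\bigr| \;\le\; \epsilon
  \qquad\text{versus}\qquad
  \|u - f\|_{p} \;=\; \Bigl(\int \bigl|u(\rho) - f(\rho)\bigr|^{p}\,d\rho\Bigr)^{1/p} \;\le\; \epsilon .
\]

Intuitively, under the sup-norm condition a training set that pins down f also pins down u everywhere up to ε, so generalization guarantees for the simple function class transfer to the algorithm's true performance; under an Lp condition, u and f can disagree arbitrarily on a small region of parameter space, which is what drives the worst-case impossibility result.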

Author Information

Nina Balcan (Carnegie Mellon University)

Maria-Florina Balcan is an Associate Professor in the School of Computer Science at Carnegie Mellon University. Her main research interests are machine learning and theoretical computer science. Her honors include the CMU SCS Distinguished Dissertation Award, an NSF CAREER Award, a Microsoft Faculty Research Fellowship, a Sloan Research Fellowship, and several paper awards. She has served as Program Committee Co-chair for COLT 2014 and for ICML 2016, and as a board member of the International Machine Learning Society.

Tuomas Sandholm (Carnegie Mellon University)

Tuomas Sandholm is the Angel Jordan Professor of Computer Science at Carnegie Mellon University. He is Founder and Director of the Electronic Marketplaces Laboratory. He has published over 450 papers. With his student Vince Conitzer, he initiated the study of automated mechanism design in 2001. In parallel with his academic career, he was Founder, Chairman, and CTO/Chief Scientist of CombineNet, Inc. from 1997 until its acquisition in 2010. During this period the company commercialized over 800 of the world's largest-scale generalized combinatorial multi-attribute auctions, with over $60 billion in total spend and over $6 billion in generated savings. He is Founder and CEO of Optimized Markets, Strategic Machine, and Strategy Robot. His algorithms also run the UNOS kidney exchange, which includes 69% of the transplant centers in the US. He has developed the leading algorithms for several general classes of games. The team that he leads is the two-time world champion in computer Heads-Up No-Limit Texas Hold’em poker, and Libratus became the first and only AI to beat top humans at that game. Among his many honors are the NSF CAREER Award, the inaugural ACM Autonomous Agents Research Award, a Sloan Fellowship, the Carnegie Science Center Award for Excellence, the Edelman Laureateship, the Newell Award for Research Excellence, and the Computers and Thought Award. He is a Fellow of the ACM, AAAI, and INFORMS. He holds an honorary doctorate from the University of Zurich.

Ellen Vitercik (Carnegie Mellon University)

Ellen Vitercik is a PhD student in computer science at Carnegie Mellon University. Her primary research interests are artificial intelligence, machine learning, theoretical computer science, and computational economics. Her honors include a National Science Foundation Graduate Research Fellowship and a Microsoft Research Women's Fellowship.
