Inductive Two-Layer Modeling with Parametric Bregman Transfer
Vignesh Ganapathiraman · Zhan Shi · Xinhua Zhang · Yaoliang Yu

Thu Jul 12th 01:30 -- 01:50 PM @ A6

Latent prediction models, exemplified by multi-layer networks, employ hidden variables that automate abstract feature discovery. They typically pose nonconvex optimization problems, and effective semi-definite programming (SDP) relaxations have been developed to enable global solutions (Aslan et al., 2014). However, these models rely on nonparametric training of layer-wise kernel representations and are therefore restricted to transductive learning, which slows down test prediction. In this paper, we develop a new inductive learning framework for parametric transfer functions using matching losses. The result for ReLU utilizes completely positive matrices, and the inductive learner not only delivers superior accuracy but also offers an order-of-magnitude speedup over SDP, with constant approximation guarantees.
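To make the matching-loss concept in the abstract concrete: for a monotone transfer function f that is the derivative of a convex potential F, the matching loss between pre-activations a and b is the Bregman divergence D_F(a, b) = F(a) − F(b) − f(b)(a − b). The sketch below illustrates this for the sigmoid transfer function (whose potential is the softplus); it is a generic illustration of matching losses, not the paper's ReLU construction, and all function names are ours.

```python
import math

def sigmoid(a):
    """Transfer function f(a) = 1 / (1 + e^{-a})."""
    return 1.0 / (1.0 + math.exp(-a))

def potential(a):
    """Softplus F(a) = log(1 + e^a), the convex potential with F' = sigmoid."""
    return math.log1p(math.exp(a))

def matching_loss(a, b):
    """Bregman divergence D_F(a, b) = F(a) - F(b) - f(b) * (a - b).

    This is the matching loss for the transfer function f = F':
    it is convex in a and vanishes exactly when a == b.
    """
    return potential(a) - potential(b) - sigmoid(b) * (a - b)
```

For the identity transfer function the same construction recovers the squared loss, which is why matching losses are the natural convexity-preserving pairing of loss and transfer function.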

Author Information

Vignesh Ganapathiraman (University of Illinois at Chicago)
Zhan Shi (University of Illinois at Chicago)
Xinhua Zhang (University of Illinois at Chicago)
Yaoliang Yu (University of Waterloo)
