We propose a modular architecture for lifelong learning of multiple hierarchically structured tasks. Specifically, we prove that our architecture is theoretically able to learn tasks that can be solved by functions that are learnable given access to functions for other, previously learned tasks as subroutines. We empirically show that some tasks that we can learn in this way are not learned by current modular lifelong learning or end-to-end training methods in practice; indeed, prior work suggests that some such tasks cannot be learned by *any* efficient method without the aid of the simpler tasks. We also consider methods for identifying the tasks automatically, without relying on explicitly given indicators.
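The core compositional idea, learning a harder task given access to previously learned task functions as subroutines, can be sketched as follows. This is an illustrative toy (the class name, method names, and the parity tasks are invented here for exposition), not the paper's actual architecture; parity is chosen because composing a learned 2-bit XOR makes a wider parity easy, whereas learning it from scratch end-to-end is known to be hard.

```python
# Illustrative sketch (hypothetical API, not the paper's construction):
# a lifelong learner that stores learned task functions and exposes
# them as subroutines for later, harder tasks.

class ModularLearner:
    def __init__(self):
        self.modules = {}  # task name -> learned function

    def add_task(self, name, fn):
        """Register the function learned for a task."""
        self.modules[name] = fn

    def subroutine(self, name):
        """Expose a previously learned task's function as a subroutine."""
        return self.modules[name]


learner = ModularLearner()

# A simple task learned first: 2-bit parity (XOR).
learner.add_task("parity2", lambda x, y: x ^ y)

# A harder task defined compositionally from the earlier subroutine:
# 4-bit parity as a tree of 2-bit parities.
p2 = learner.subroutine("parity2")
learner.add_task("parity4", lambda a, b, c, d: p2(p2(a, b), p2(c, d)))

print(learner.subroutine("parity4")(1, 0, 1, 1))  # prints 1
```

The point of the sketch is only the access pattern: once `parity2` is stored, the `parity4` learner never has to rediscover XOR; it composes the stored module.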
Author Information
Zihao Deng (Washington University in St. Louis)
Zee Fryer (Google Research)
Brendan Juba (Washington University in St. Louis)
Rina Panigrahy (Google)
Xin Wang (Google)
More from the Same Authors
- 2022: For Manifold Learning, Deep Neural Networks can be Locality Sensitive Hash Functions
  Nishanth Dikkala · Gal Kaplun · Rina Panigrahy
- 2022: A Theoretical View on Sparsely Activated Networks
  Cenk Baykal · Nishanth Dikkala · Rina Panigrahy · Cyrus Rashtchian · Xin Wang
- 2021 Poster: Probabilistic Generating Circuits
  Honghua Zhang · Brendan Juba · Guy Van den Broeck
- 2021 Oral: Probabilistic Generating Circuits
  Honghua Zhang · Brendan Juba · Guy Van den Broeck