

Tutorial

Recent Advances in Population-Based Search for Deep Neural Networks: Quality Diversity, Indirect Encodings, and Open-Ended Algorithms

Jeff Clune · Joel Lehman · Kenneth Stanley

Hall A

Abstract:

We will cover new, exciting, and unconventional techniques for improving population-based search. These ideas are already enabling us to solve hard problems, and they hold great promise for further advancing machine learning, including deep neural networks. Major topics include: (1) explicitly searching for behavioral diversity, especially with Quality Diversity algorithms, where diversity is sought in a low-dimensional space in which it is inherently interesting (such as the behaviors of a robot) rather than in the true search space (such as the weights of the DNN that controls the robot); these algorithms have produced state-of-the-art results in robotics and solved a version of the hard-exploration RL challenge of Montezuma's Revenge. (2) Open-ended search, wherein algorithms continually create new and increasingly complex capabilities without bound, for example by simultaneously inventing new challenges and their solutions. (3) Indirect encoding (e.g., HyperNEAT, Hypernetworks), wherein one network encodes how to construct a larger neural network or learning system; the idea is motivated by biological development, in which a search in the space of a few thousand genes specifies a trillion-connection brain and its learning algorithm. We conclude with a discussion of current and future hybrids of traditional machine learning with these ideas, including how augmenting meta-learning with them offers an alternative path to our most ambitious AI goals.
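To make the Quality Diversity idea concrete, below is a minimal MAP-Elites-style sketch in Python. It is not from the tutorial itself: the toy genome, objective, behavior descriptor, and all names (evaluate, to_cell, N_BINS, ITERATIONS) are illustrative assumptions. The point it shows is that candidates compete only within niches of a low-dimensional behavior space, so the archive accumulates a diverse collection of high-performing elites rather than a single optimum.

```python
import numpy as np

# Minimal MAP-Elites-style Quality Diversity sketch on a toy problem.
# Assumptions: 2-D genomes, a quadratic toy objective, and a behavior
# descriptor that is simply the genome clipped to [0, 1]^2.

N_BINS = 10            # grid resolution per behavior dimension
GENOME_DIM = 2
ITERATIONS = 10_000
rng = np.random.default_rng(0)

def evaluate(genome):
    """Return (fitness, low-dimensional behavior descriptor)."""
    fitness = -float(np.sum((genome - 0.5) ** 2))   # toy objective
    behavior = np.clip(genome, 0.0, 1.0)            # toy behavior space
    return fitness, behavior

def to_cell(behavior):
    """Discretize a behavior descriptor into an archive cell index."""
    return tuple(np.minimum((behavior * N_BINS).astype(int), N_BINS - 1))

archive = {}  # cell -> (fitness, genome): one elite per behavior niche

for _ in range(ITERATIONS):
    if archive:
        # Pick a random elite from the archive and mutate it.
        key = list(archive)[rng.integers(len(archive))]
        genome = archive[key][1] + rng.normal(0.0, 0.1, GENOME_DIM)
    else:
        genome = rng.random(GENOME_DIM)             # random initial solution

    fitness, behavior = evaluate(genome)
    cell = to_cell(behavior)
    # Keep the newcomer if its niche is empty or it beats the current elite.
    if cell not in archive or fitness > archive[cell][0]:
        archive[cell] = (fitness, genome)

print(f"filled {len(archive)} / {N_BINS**2} niches; "
      f"best fitness = {max(f for f, _ in archive.values()):.4f}")
```

In a robotics setting, the genome would instead encode a DNN controller and the behavior descriptor something like the robot's final position or gait features; the archive structure and insertion rule stay the same.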
