Recent successes in artificial intelligence and machine learning have been largely driven by methods for sophisticated pattern recognition, including deep neural networks and other data-intensive methods. But human intelligence is more than just pattern recognition. And no machine system yet built has anything like the flexible, general-purpose commonsense grasp of the world that we can see in even a one-year-old human infant. I will consider how we might capture the basic learning and thinking abilities humans possess from early childhood, as one route to building more human-like forms of machine learning and thinking.
At the heart of human common sense is our ability to model the physical and social environment around us: to explain and understand what we see, to imagine things we could see but haven't yet, to solve problems and plan actions to make these things real, and to build new models as we learn more about the world. I will focus on our recent work reverse-engineering these capacities using methods from probabilistic programming, program induction and program synthesis, which, together with deep learning methods and video game simulation engines, provide a toolkit for the joint enterprise of modeling human intelligence and making AI systems smarter in more human-like ways.
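To make the flavor of this toolkit concrete, below is a minimal, illustrative sketch (not taken from the talk) of probabilistic inference over a generative simulator: a toy one-dimensional "physics engine" is treated as a generative model, and importance sampling is used to infer an unknown object mass from a noisy observed trajectory. All function names, the prior over mass, and the parameter values here are assumptions chosen purely for illustration.

```python
# Illustrative sketch: Bayesian inference over a toy physics simulator.
# We infer an unknown mass by sampling from a prior, simulating forward,
# and weighting each sample by how well it explains the observed trajectory.
import random
import math

def simulate(mass, force=1.0, dt=0.1, steps=10):
    """Toy 1-D simulator: constant force on a point mass; returns positions."""
    pos, vel, traj = 0.0, 0.0, []
    for _ in range(steps):
        vel += (force / mass) * dt
        pos += vel * dt
        traj.append(pos)
    return traj

def log_likelihood(observed, simulated, noise=0.05):
    """Gaussian observation noise on each simulated position."""
    return sum(-0.5 * ((o - s) / noise) ** 2 for o, s in zip(observed, simulated))

# "Observed" data generated from a hidden true mass of 2.0, plus noise.
random.seed(0)
observed = [p + random.gauss(0, 0.05) for p in simulate(2.0)]

# Importance sampling with a uniform prior over mass on [0.5, 5.0].
samples = [random.uniform(0.5, 5.0) for _ in range(5000)]
log_weights = [log_likelihood(observed, simulate(m)) for m in samples]
max_lw = max(log_weights)
weights = [math.exp(lw - max_lw) for lw in log_weights]
posterior_mean = sum(m * w for m, w in zip(samples, weights)) / sum(weights)
print(f"Posterior mean mass = {posterior_mean:.2f} (true mass = 2.0)")
```

Probabilistic programming systems automate exactly this pattern (a generative program plus a generic inference engine), letting the simulator be as rich as a full game engine rather than the ten-line toy used here.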
Author Information
Josh Tenenbaum (MIT)
Joshua Brett Tenenbaum is Professor of Cognitive Science and Computation at the Massachusetts Institute of Technology. He is known for contributions to mathematical psychology and Bayesian cognitive science. He previously taught at Stanford University, where he was the Wasow Visiting Fellow from October 2010 to January 2011. Tenenbaum received his undergraduate degree in physics from Yale University in 1993, and his Ph.D. from MIT in 1999. His work primarily focuses on analyzing probabilistic inference as the engine of human cognition and as a means to develop machine learning.
More from the Same Authors
- 2020 Poster: Visual Grounding of Learned Physical Models
  Yunzhu Li · Toru Lin · Kexin Yi · Daniel Bear · Daniel Yamins · Jiajun Wu · Josh Tenenbaum · Antonio Torralba
- 2019 Poster: Learning to Infer Program Sketches
  Maxwell Nye · Luke Hewitt · Josh Tenenbaum · Armando Solar-Lezama
- 2019 Oral: Learning to Infer Program Sketches
  Maxwell Nye · Luke Hewitt · Josh Tenenbaum · Armando Solar-Lezama
- 2019 Poster: Infinite Mixture Prototypes for Few-shot Learning
  Kelsey Allen · Evan Shelhamer · Hanul Shin · Josh Tenenbaum
- 2019 Oral: Infinite Mixture Prototypes for Few-shot Learning
  Kelsey Allen · Evan Shelhamer · Hanul Shin · Josh Tenenbaum
- 2019 Poster: Neurally-Guided Structure Inference
  Sidi Lu · Jiayuan Mao · Josh Tenenbaum · Jiajun Wu
- 2019 Oral: Neurally-Guided Structure Inference
  Sidi Lu · Jiayuan Mao · Josh Tenenbaum · Jiajun Wu