Interpreting decision-making logic in demonstration videos is key to collaborating with and mimicking humans. To empower machines with this ability, we propose a neural program synthesizer that is able to explicitly synthesize underlying programs from behaviorally diverse and visually complicated demonstration videos. We introduce a summarizer module as part of our model to improve the network's ability to integrate multiple demonstrations varying in behavior. We also employ a multi-task objective to encourage the model to learn meaningful intermediate representations for end-to-end training. We show that our model is able to reliably synthesize underlying programs as well as capture diverse behaviors exhibited in demonstrations. The code is available at https://shaohua0116.github.io/demo2program.
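The abstract describes a pipeline of a demonstration encoder, a summarizer that aggregates multiple behaviorally diverse demonstrations, and a program decoder. The following is a minimal, hypothetical PyTorch sketch of such a pipeline, not the authors' implementation: all layer sizes, module names, and the choice of LSTM-based pooling are illustrative assumptions, and the multi-task objective is not shown.

```python
# Hypothetical sketch: per-frame CNN -> per-demo LSTM -> summarizer across
# demos -> autoregressive LSTM program decoder. Shapes/sizes are assumptions.
import torch
import torch.nn as nn

class DemoEncoder(nn.Module):
    """Encodes one demonstration video (a sequence of frames) into a vector."""
    def __init__(self, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(                 # tiny frame encoder (assumed)
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.rnn = nn.LSTM(32, hidden, batch_first=True)

    def forward(self, frames):                    # frames: (T, 3, H, W)
        feats = self.cnn(frames).unsqueeze(0)     # (1, T, 32)
        _, (h, _) = self.rnn(feats)
        return h[-1, 0]                           # (hidden,)

class Summarizer(nn.Module):
    """Aggregates embeddings of multiple, behaviorally diverse demos."""
    def __init__(self, hidden=128):
        super().__init__()
        self.rnn = nn.LSTM(hidden, hidden, batch_first=True)

    def forward(self, demo_embeddings):           # (K, hidden) for K demos
        _, (h, _) = self.rnn(demo_embeddings.unsqueeze(0))
        return h[-1, 0]                           # summary vector (hidden,)

class ProgramDecoder(nn.Module):
    """Autoregressively decodes program tokens from the summary vector."""
    def __init__(self, vocab=50, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.cell = nn.LSTMCell(hidden, hidden)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, summary, tokens):           # tokens: (L,) teacher forcing
        h, c = summary.view(1, -1), torch.zeros(1, summary.numel())
        logits = []
        for tok in tokens:
            h, c = self.cell(self.embed(tok).view(1, -1), (h, c))
            logits.append(self.out(h[0]))
        return torch.stack(logits)                # (L, vocab)

# Smoke test with random data: 3 demos of 5 frames each, an 8-token program.
enc, summ, dec = DemoEncoder(), Summarizer(), ProgramDecoder()
demos = [torch.randn(5, 3, 64, 64) for _ in range(3)]
embeddings = torch.stack([enc(d) for d in demos])
summary = summ(embeddings)
program_logits = dec(summary, torch.randint(0, 50, (8,)))
print(program_logits.shape)                       # torch.Size([8, 50])
```

In practice the token logits would be trained with a cross-entropy loss against ground-truth program tokens, with auxiliary losses supplying the multi-task signal mentioned in the abstract.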
Author Information
Shao-Hua Sun (University of Southern California)
Hyeonwoo Noh (POSTECH)
Sriram Somasundaram (University of Southern California)
Joseph Lim (University of Southern California)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Oral: Neural Program Synthesis from Diverse Demonstration Videos
  Fri Jul 13th 08:10 -- 08:20 AM, Room K1
More from the Same Authors
- 2020 Poster: Generalization to New Actions in Reinforcement Learning
  Ayush Jain · Andrew Szot · Joseph Lim