We address the problem of any-code completion: generating a missing piece of source code in a given program without any restriction on its vocabulary or structure. We introduce a new approach to any-code completion that leverages the strict syntax of programming languages to model a code snippet as a tree, which we call structural language modeling (SLM). SLM estimates the probability of the program's abstract syntax tree (AST) by decomposing it into a product of conditional probabilities over its nodes. We present a neural model that computes these conditional probabilities by considering all AST paths leading to a target node. Unlike previous techniques, which severely restricted the kinds of expressions that can be generated in this task, our approach can generate arbitrary code in any programming language. Our model significantly outperforms both seq2seq models and a variety of structured approaches in generating Java and C# code. Our code, data, and trained models are available at http://github.com/tech-srl/slm-code-generation/. An online demo is available at http://AnyCodeGen.org.
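To make the SLM factorization concrete, here is a minimal Python sketch of the decomposition the abstract describes: the probability of an AST is the product, over its nodes, of each node's conditional probability given the previously generated nodes. The Node class, the preorder generation order, and the node_log_prob scorer below are hypothetical stand-ins for illustration, not the paper's actual model or API.

```python
# A minimal sketch of the chain-rule decomposition described in the abstract,
# NOT the authors' implementation. Node, the traversal order, and the
# `node_log_prob` scorer are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)

def log_prob_tree(root: Node,
                  node_log_prob: Callable[[Node, List[Node]], float]) -> float:
    """log P(AST) = sum over nodes t of log P(a_t | a_<t), where a_<t are the
    nodes generated before a_t (here: left-to-right preorder)."""
    total = 0.0
    generated: List[Node] = []          # context: nodes produced so far
    stack = [root]
    while stack:
        node = stack.pop()
        total += node_log_prob(node, generated)   # conditional term for a_t
        generated.append(node)
        stack.extend(reversed(node.children))     # visit children left to right
    return total

# Toy usage with a dummy uniform scorer; a real SLM would compute each
# conditional from the set of AST paths leading to the target node.
tree = Node("IfStmt", [Node("Cond"), Node("Body")])
print(log_prob_tree(tree, lambda node, ctx: -1.0))  # 3 nodes -> -3.0
```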
Author Information
Uri Alon (Technion - Israel Institute of Technology)
Roy Sadaka (Technion)
Omer Levy (Tel Aviv University / Facebook AI Research)
Eran Yahav (Technion)