Published May 26, 2017
10 - A Syntactic Neural Model for General-Purpose Code Generation
Explore cutting-edge techniques in code generation with Pengcheng Yin and Graham Neubig from Carnegie Mellon University, as they delve into a sequence-to-tree approach that combines neural networks with grammar formalisms to translate natural language into executable code, pushing the boundaries of semantic parsing and neural models.
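The paper's key idea is to constrain a neural decoder with the target language's grammar so that it emits abstract syntax trees rather than raw token strings: each decoding step either applies a production rule to a frontier non-terminal or generates a terminal token. The sketch below is a hypothetical toy, not the authors' implementation; the grammar, token table, and `score_actions` stub are invented for illustration, and a learned scorer would replace the stub.

```python
# Minimal toy sketch (not the paper's code) of grammar-constrained decoding:
# each step either expands a non-terminal with a production (APPLYRULE) or
# emits a terminal token (GENTOKEN), so every output is a well-formed tree.

GRAMMAR = {  # non-terminal -> candidate right-hand sides (productions)
    "stmt": [["expr"]],
    "expr": [["Call", "func", "args"], ["Name"]],
    "func": [["Name"]],
    "args": [["expr"]],
}
TERMINALS = {"Name": "sorted", "Call": "call"}  # toy token vocabulary


def score_actions(rules, symbol):
    """Stand-in for the neural action scorer. In the actual model this is a
    softmax over grammar-valid actions conditioned on the input utterance
    and decoding history; here we pick the first production so the sketch
    runs standalone."""
    return rules[0]


def decode(symbol, depth=0, max_depth=4):
    if symbol in GRAMMAR:
        rules = GRAMMAR[symbol]
        # Near the depth limit, fall back to the shortest production so this
        # toy recursion terminates; the real model learns when to stop.
        rule = min(rules, key=len) if depth >= max_depth else score_actions(rules, symbol)
        return {symbol: [decode(child, depth + 1, max_depth) for child in rule]}
    return TERMINALS.get(symbol, symbol)  # GENTOKEN: emit a terminal


if __name__ == "__main__":
    # Expansion is restricted to productions in GRAMMAR, so the result is
    # guaranteed to be a syntactically valid tree.
    print(decode("stmt"))
```

The grammar constraint is what makes the model "syntactic": because invalid expansions are never even scored, the decoder cannot produce malformed programs, which is the advantage over treating code generation as plain sequence-to-sequence translation.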

Related Episodes
04 - Recurrent Neural Network Grammars, with Chris Dyer
25 - Neural Semantic Parsing over Multiple Knowledge-bases
05 - Transition-Based Dependency Parsing with Stack Long Short-Term Memory
61 - Neural Text Generation in Stories, with Elizabeth Clark and Yangfeng Ji
29 - Neural machine translation via binary code prediction, with Graham Neubig
09 - Learning to Generate Reviews and Discovering Sentiment
91 - (Executable) Semantic Parsing, with Jonathan Berant
22 - Deep Multitask Learning for Semantic Dependency Parsing, with Noah Smith
63 - Neural Lattice Language Models, with Jacob Buckman
42 - Generating Sentences by Editing Prototypes, with Kelvin Guu
17 - pix2code: Generating Code from a Graphical User Interface Screenshot
138 - Compositional Generalization in Neural Networks, with Najoung Kim