Published Jul 23, 2018
61 - Neural Text Generation in Stories, with Elizabeth Clark and Yangfeng Ji
Elizabeth Clark and Yangfeng Ji discuss their work on neural text generation, explaining how entity representations can improve coherence in story generation and how advanced language models could change creative writing.

Related Episodes
108 - Data-To-Text Generation, with Verena Rieser and Ondřej Dušek
64 - Neural Network Models for Sentence Pair Tasks, with Wuwei Lan and Wei Xu
139 - Coherent Long Story Generation, with Kevin Yang
120 - Evaluation of Text Generation, with Asli Celikyilmaz
10 - A Syntactic Neural Model for General-Purpose Code Generation
89 - Dialog Systems, with Zhou Yu
42 - Generating Sentences by Editing Prototypes, with Kelvin Guu
119 - Social NLP, with Diyi Yang
63 - Neural Lattice Language Models, with Jacob Buckman
09 - Learning to Generate Reviews and Discovering Sentiment
04 - Recurrent Neural Network Grammars, with Chris Dyer
87 - Pathologies of Neural Models Make Interpretation Difficult, with Shi Feng