How to Fine-tune GPT-3
I discuss how to fine-tune GPT-3, a state-of-the-art large language model that is revolutionizing natural language processing and understanding systems.
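As a taste of what the post covers, here is a minimal sketch of launching a fine-tuning run with the OpenAI Python library's original GPT-3 fine-tuning endpoints. The file name, base model, and hyperparameters are illustrative assumptions rather than the post's actual settings, and the API surface has evolved since GPT-3's release.

```python
import openai

openai.api_key = "sk-..."  # assumption: your OpenAI API key

# Upload a JSONL file of {"prompt": ..., "completion": ...} training records.
# "train_data.jsonl" is a hypothetical file name used for illustration.
upload = openai.File.create(
    file=open("train_data.jsonl", "rb"),
    purpose="fine-tune",
)

# Launch a fine-tuning job against a GPT-3 base model ("curie" is an
# illustrative choice; other base models were also available).
job = openai.FineTune.create(
    training_file=upload["id"],
    model="curie",
    n_epochs=4,  # assumed hyperparameter, not taken from the post
)
print(job["id"], job["status"])
```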
I show how you can deploy a performant question-answering system with just 60 lines of Python.
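The post's full 60-line implementation isn't reproduced here, but as an illustrative sketch, an extractive question-answering system can be stood up in a few lines with the Hugging Face transformers pipeline (the model checkpoint and example text are assumptions, not the post's exact choices):

```python
from transformers import pipeline

# Load a pretrained extractive QA model; this checkpoint is an assumed,
# commonly used one, and weights are downloaded on first use.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = (
    "The Transformer is a purely attention-based architecture introduced in "
    "2017 that dispenses with recurrence entirely."
)
result = qa(question="When was the Transformer introduced?", context=context)
print(result["answer"], round(result["score"], 3))
```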
Following an eventful week at ACL 2019, I summarize the key trends and takeaways to watch for in NLP.
We discuss the Transformer, a purely attention-based architecture that is more performant, more efficient, and more parallelizable than recurrent network-based models.
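Since the post centers on the Transformer's core mechanism, here is a minimal NumPy sketch of scaled dot-product attention, the building block that replaces recurrence. The single-head, unbatched setup and the toy shapes are simplifying assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # weighted sum of value vectors

# Toy example: 3 query vectors attending over 4 key/value pairs of width 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)    # (3, 8)
```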
I describe ELMo, a recently released set of contextualized neural word representations that are pushing the state of the art in pretraining methodology for natural language processing.
Following my attendance at the 18th Annual Meeting on Discourse and Dialogue, I summarize the most promising directions for future dialogue research, as gleaned from discussions with other researchers at the conference.
To help spur conversational assistant research, we release a corpus of 3,031 grounded, multi-turn dialogues in three distinct domains appropriate for an in-car assistant: calendar scheduling, weather information retrieval, and point-of-interest navigation.