How to Fine-Tune GPT-3

I discuss how to fine-tune GPT-3, a state-of-the-art large language model that is transforming natural language processing and understanding systems.

Deploying a State-of-the-Art Question Answering System With 60 Lines of Python Using HuggingFace and Streamlit

I show how you can deploy a performant question-answering system with just 60 lines of Python.

Trends in Natural Language Processing: ACL 2019 In Review

Following an eventful week at ACL 2019, I summarize the key trends and takeaways in NLP to watch going forward.

Transformers: Attention in Disguise

We discuss the Transformer, a purely attention-based architecture that is more performant, more efficient, and more parallelizable than recurrent network-based models.

Deep Contextualized Word Representations with ELMo

I describe ELMo, a recently released set of neural word representations that are pushing the state of the art in natural language processing pretraining methodology.

Review of SIGDial/SemDial 2017

Following my attendance at the 18th Annual Meeting on Discourse and Dialogue, I summarize the most promising directions for future dialogue research, as gleaned from discussions with other researchers at the conference.

A New Multi-Turn, Multi-Domain, Task-Oriented Dialogue Dataset

To help spur conversational assistant research, we release a corpus of 3,031 grounded, multi-turn dialogues in three distinct domains appropriate for an in-car assistant: calendar scheduling, weather information retrieval, and point-of-interest navigation.