Following an eventful week at ACL 2019, I summarize the key trends and takeaways to watch for in NLP.
I describe recurrent neural networks, a deep learning technique geared toward sequential tasks such as natural language processing.
We discuss convolutional neural networks, a deep learning model inspired by the human visual system that has rocked the state of the art in computer vision tasks.
We investigate a motley assortment of interesting properties and goodies of neural networks.
We begin a deep dive into deep learning by investigating feedforward neural networks.
We discuss the Transformer, a purely attention-based architecture that is more performant, more efficient, and more parallelizable than recurrent network-based models.
I describe ELMo, a recently released set of neural word representations that is pushing the state of the art in natural language processing pretraining methodology.
I dive into the details of Fast R-CNN, an extension to the original R-CNN model that boasted a 9x speedup over its predecessor, as well as state-of-the-art object detection results.
A discussion of R-CNN, a historic object detection architecture combining region proposals with convolutional neural networks.
A discussion of the fundamental deep learning algorithms that newcomers to the field should learn, along with a recommended course of study.
Given all the recent buzz around artificial intelligence, I discuss three reasons why we are seeing such widespread interest in the field today.
Following my attendance at the 18th Annual Meeting on Discourse and Dialogue, I summarize the most promising directions for future dialogue research, as gleaned from discussions with other researchers at the conference.