What is the Data Science Life Cycle?

I describe the data science life cycle, a methodology for effectively developing and deploying data-driven projects.

A Complete Machine Learning Project From Scratch: Model Deployment and Continuous Integration

In this fifth post in a series on how to build a complete machine learning product from scratch, I describe how to deploy our model and set up a continuous integration system.

A Complete Machine Learning Project From Scratch: Error Analysis And Model V2

In this fourth post in a series on how to build a complete machine learning product from scratch, I describe how to perform error analysis on our first model and work toward building a V2 model.

A Complete Machine Learning Project From Scratch: Model V1

In this third post in a series on how to build a complete machine learning product from scratch, I describe how to build an initial model with an associated training/evaluation pipeline and functionality tests.

A Complete Machine Learning Project From Scratch: Exploratory Data Analysis

In this second post in a series on how to build a complete machine learning product from scratch, I describe how to acquire your dataset and perform initial exploratory data analysis.

A Complete Machine Learning Project From Scratch: Setting Up

In this first post in a series on how to build a complete machine learning product from scratch, I describe how to set up your project and tooling.

We Don't Need Data Scientists, We Need Data Engineers

After analyzing 1,000+ Y Combinator companies, I find a huge market need for more engineering-focused data practitioner roles.

Deploying a State-of-the-Art Question Answering System With 60 Lines of Python Using HuggingFace and Streamlit

I show how you can deploy a performant question-answering system with just 60 lines of Python.

A Complete 4-Year Course Plan for an Artificial Intelligence Undergraduate Degree

I describe in-depth the courses necessary for a 4-year undergraduate degree in artificial intelligence, assuming you step onto campus tomorrow.

Trends in Machine Learning: NeurIPS 2019 In Review

Following an eventful week at NeurIPS 2019, I summarize some key trends in machine learning today.

The Birth of Venus: Building a Deep Learning Computer From Scratch

In which I describe how I built a deep learning computer starting from nothing but a pile of hardware components and a dream.

A Comprehensive Introduction to Machine Learning Fundamentals

I provide an aggregated and comprehensive list of tutorials I have made for fundamental concepts in machine learning.

Decision Trees

I discuss decision trees, powerful general-purpose models that are also interpretable.

Market Basket Analysis

I discuss market basket analysis, an unsupervised learning technique for understanding and quantifying the relationships between sets of items.

Your Closest Neighbors

I discuss the k-nearest neighbors algorithm, a remarkably simple but effective machine learning model.

Model Evaluation

In which I describe commonly used techniques for evaluating machine learning models.

Join the Ensemble

In which I discuss ensembling, a technique that improves on the performance of a single machine learning model by combining the predictions of several models.

Why Did You Choose This Model?

I discuss strategies such as cross-validation that are used for selecting the best-performing machine learning models.

Model Regularization

In which I discuss regularization, a strategy for controlling model complexity and improving generalization to new datasets.

Principal Components Analysis

In which I give a primer on principal components analysis, a commonly used technique for dimensionality reduction in machine learning.

What Features Do You Want?

In which I discuss feature selection, which is used in machine learning to improve model generalization and reduce feature dimensionality, among other benefits.

Controlling Your Model's Bias

In which I describe the bias-variance tradeoff, one of the most important concepts underlying all of machine learning theory.

Trends in Natural Language Processing: ACL 2019 In Review

Following an eventful week at ACL 2019, I summarize key trends and takeaways to look for in NLP.

What K-Means

In which we investigate K-means clustering, a common unsupervised technique for grouping and analyzing data.

Recurrent Neural Networks

I describe recurrent neural networks, a deep learning technique geared toward sequential tasks such as natural language processing.

Dance Dance Convolution

We discuss convolutional neural networks, a deep learning model inspired by the human visual system that has rocked the state-of-the-art in computer vision tasks.

Neural Network Grab Bag

We investigate a motley assortment of interesting properties and goodies of neural networks.

Deep Dive Into Neural Networks

We begin a deep dive into deep learning by investigating feedforward neural networks.

Basics of Support Vector Machines

We discuss support vector machines, a very powerful and versatile machine learning model.

Naive Bayes Classifier Tutorial

I describe Naive Bayes, a commonly used generative model for a variety of classification tasks.

Logistic Regression in Machine Learning Tutorial

I describe logistic regression, one of the cornerstone algorithms of the modern-day machine learning toolkit.

Fundamentals of Linear Regression

I describe the basics of linear regression, one of the most common and widely used machine learning techniques.

Transformers: Attention in Disguise

We discuss the Transformer, a purely attention-based architecture that is more performant, more efficient, and more parallelizable than recurrent network-based models.

Deep Contextualized Word Representations with ELMo

I describe ELMo, a recently released set of neural word representations that are pushing the state-of-the-art in natural language processing pretraining methodology.

Fast Object Detection with Fast R-CNN

I dive into the details of Fast R-CNN, an extension to the original R-CNN model that boasted a 9x speedup over its predecessor as well as state-of-the-art object detection results.

Object Detection with R-CNN

A discussion of R-CNN, a historic object detection architecture combining region proposals with convolutional neural networks.

Fundamental Deep Learning Algorithms To Learn

A discussion of fundamental deep learning algorithms people new to the field should learn along with a recommended course of study.

Why All the Excitement About Artificial Intelligence?

Given all the recent buzz around artificial intelligence, I discuss three reasons why we are seeing such widespread interest in the field today.

Being a Good Machine Learning Engineer/Data Scientist

A discussion of the most important skills necessary for being an effective machine learning engineer or data scientist.

A Software Interview Challenge

In light of a particularly exhausting software job interview cycle, I discuss the general format of such interviews and propose a playful challenge for anyone going through a similar experience.

Review of SIGDial/SemDial 2017

Following my attendance at the 18th Annual Meeting on Discourse and Dialogue, I summarize the most promising directions for future dialogue research, as gleaned from discussions with other researchers at the conference.

A New Multi-Turn, Multi-Domain, Task-Oriented Dialogue Dataset

To help spur conversational assistant research, we release a corpus of 3,031 grounded, multi-turn dialogues in three distinct domains appropriate for an in-car assistant: calendar scheduling, weather information retrieval, and point-of-interest navigation.