How Alexa Dropped the Ball on Being the Top Conversational System on the Planet
I discuss why Alexa missed the opportunity to take the lead and become the dominant player in the conversational AI market.
I discuss the messy state of MLOps today and how we are still in the early phases of a broader transformation to bring machine learning value to enterprises globally.
After analyzing 1,000+ Y Combinator companies, I discover there's a huge market need for more engineering-focused data practitioner roles.
I describe in-depth the courses necessary for a 4-year undergraduate degree in artificial intelligence, assuming you step onto campus tomorrow.
Following an eventful week at NeurIPS 2019, I summarize some key trends in machine learning today.
In which I describe how I built a deep learning computer starting from nothing but a pile of hardware components and a dream.
I provide an aggregated and comprehensive list of tutorials I have made for fundamental concepts in machine learning.
I discuss market basket analysis, an unsupervised learning technique for understanding and quantifying the relationships between sets of items.
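The post goes into detail; purely as an illustration (not code from the post), here is a minimal Python sketch, using made-up transaction data, of the support, confidence, and lift statistics that market basket analysis is built on:

```python
from itertools import combinations

# Toy transaction data (hypothetical); each transaction is a set of purchased items.
transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent in basket | antecedent in basket)."""
    return support(antecedent | consequent) / support(antecedent)

def lift(antecedent, consequent):
    """How much more often the sets co-occur than if they were independent."""
    return confidence(antecedent, consequent) / support(consequent)

# Score every rule {a} -> {b} over single items and print the stronger-than-chance ones.
items = sorted(set().union(*transactions))
for a, b in combinations(items, 2):
    rule_lift = lift({a}, {b})
    if rule_lift > 1.0:  # co-occur more often than independence would predict
        print(f"{{{a}}} -> {{{b}}}: lift={rule_lift:.2f}")
```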
I discuss decision trees, which are very powerful, general-purpose models that are also interpretable.
I discuss the k-nearest neighbors algorithm, a remarkably simple but effective machine learning model.
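As a rough taste of the idea (not code from the post itself), here is a minimal numpy sketch of k-nearest-neighbor classification on a tiny made-up dataset:

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    """Classify `x_query` by majority vote among its k nearest training points."""
    # Euclidean distance from the query point to every training point.
    distances = np.linalg.norm(X_train - x_query, axis=1)
    # Labels of the k closest points.
    nearest_labels = y_train[np.argsort(distances)[:k]]
    # Majority vote.
    values, counts = np.unique(nearest_labels, return_counts=True)
    return values[np.argmax(counts)]

# Tiny synthetic 2D dataset: class 0 clustered near the origin, class 1 near (5, 5).
X_train = np.array([[0.0, 0.2], [0.3, 0.1], [0.1, 0.4],
                    [5.0, 5.1], [4.8, 5.3], [5.2, 4.9]])
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X_train, y_train, np.array([4.9, 5.0])))  # expected: 1
```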
In which I describe commonly used techniques for evaluating machine learning models.
I discuss strategies such as cross-validation, which are used for selecting the best-performing machine learning models.
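For illustration only (the post covers this properly), here is a hedged sketch of what such model selection can look like with scikit-learn, on a built-in toy dataset and an assumed grid of Ridge regularization strengths:

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Compare a few candidate regularization strengths with 5-fold cross-validation
# and keep the one with the best held-out score.
X, y = load_diabetes(return_X_y=True)

best_alpha, best_score = None, -np.inf
for alpha in [0.01, 0.1, 1.0, 10.0]:
    # Mean R^2 across the 5 held-out folds for this hyperparameter setting.
    score = cross_val_score(Ridge(alpha=alpha), X, y, cv=5).mean()
    if score > best_score:
        best_alpha, best_score = alpha, score

print(f"selected alpha={best_alpha} (mean CV R^2 = {best_score:.3f})")
```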
In which I discuss the technique of ensembling, which improves on the performance of a single machine learning model by combining the predictions of several models.
In which I discuss regularization, a strategy for controlling model complexity so that a model generalizes better to new datasets.
In which I give a primer on principal components analysis, a commonly used technique for dimensionality reduction in machine learning.
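Not from the post, but as a small illustration of the idea, here is a minimal numpy sketch of PCA via eigendecomposition of the covariance matrix, applied to synthetic data:

```python
import numpy as np

def pca(X, n_components=2):
    """Project `X` onto its top principal components via eigendecomposition."""
    # Center the data so each feature has zero mean.
    X_centered = X - X.mean(axis=0)
    # Covariance matrix of the features.
    cov = np.cov(X_centered, rowvar=False)
    # Eigenvectors of the covariance matrix are the principal directions.
    eigvals, eigvecs = np.linalg.eigh(cov)
    # eigh returns eigenvalues in ascending order; take the largest ones.
    order = np.argsort(eigvals)[::-1][:n_components]
    components = eigvecs[:, order]
    return X_centered @ components  # data expressed in the reduced basis

# Example: 200 points in 3D that mostly vary along one direction, plus noise.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = t @ np.array([[2.0, 1.0, 0.5]]) + 0.1 * rng.normal(size=(200, 3))
print(pca(X, n_components=2).shape)  # (200, 2)
```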
In which I discuss feature selection, which is used in machine learning to help improve model generalization and reduce feature dimensionality, among other benefits.
In which I describe the bias-variance tradeoff, one of the most important concepts underlying all of machine learning theory.
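The post treats this formally; as a rough, assumption-laden illustration, the following numpy sketch estimates bias squared and variance empirically by refitting polynomials of different degrees on many resampled noisy training sets drawn from an assumed sine-function ground truth:

```python
import numpy as np

rng = np.random.default_rng(0)
true_fn = np.sin  # assumed ground-truth function we are trying to learn
x_test = np.linspace(0, np.pi, 50)

def fit_and_predict(degree):
    """Fit a polynomial of the given degree on a fresh noisy training set,
    then predict on the fixed test grid."""
    x_train = rng.uniform(0, np.pi, 20)
    y_train = true_fn(x_train) + rng.normal(scale=0.3, size=x_train.shape)
    coeffs = np.polyfit(x_train, y_train, degree)
    return np.polyval(coeffs, x_test)

for degree in [1, 3, 9]:
    # Repeat the experiment on many resampled training sets.
    preds = np.stack([fit_and_predict(degree) for _ in range(200)])
    bias_sq = np.mean((preds.mean(axis=0) - true_fn(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    print(f"degree={degree}: bias^2={bias_sq:.3f}, variance={variance:.3f}")
```

In this toy setup the low-degree fit shows high bias and low variance, while the high-degree fit shows the reverse, which is the tradeoff in miniature.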
Following an eventful week at ACL 2019, I summarize key trends and takeaways in NLP.
In which we investigate K-means clustering, a common unsupervised clustering technique for analyzing data.
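As an illustration only (not the post's code), here is a minimal numpy sketch of Lloyd's algorithm for K-means, run on two synthetic blobs:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Plain Lloyd's algorithm: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    # Initialize centroids as k randomly chosen data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: each point goes to its nearest centroid.
        distances = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = distances.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Two well-separated blobs of 2D points.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
print(centroids.round(2))  # one centroid near (0, 0), the other near (5, 5)
```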
I describe recurrent neural networks, a deep learning technique geared for tasks such as natural language processing.
We discuss convolutional neural networks, a deep learning model inspired by the human visual system that has rocked the state of the art in computer vision tasks.
We investigate a motley collection of interesting properties and goodies of neural networks.
We begin a deep dive into deep learning by investigating feedforward neural networks.
We discuss support vector machines, a very powerful and versatile machine learning model.
I describe Naive Bayes, a commonly used generative model for a variety of classification tasks.
I describe logistic regression, one of the cornerstone algorithms of the modern-day machine learning toolkit.
I describe the basics of linear regression, one of the most common and widely used machine learning techniques.
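Purely as a small illustration (not code from the post), here is a numpy sketch that fits ordinary least squares via the normal equations on synthetic data:

```python
import numpy as np

# Fit y ≈ X w by ordinary least squares using the normal equations.
rng = np.random.default_rng(0)
n, d = 100, 3
X = rng.normal(size=(n, d))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=n)

# Add a column of ones so the model learns an intercept term too.
X_design = np.hstack([np.ones((n, 1)), X])

# w = (X^T X)^{-1} X^T y, computed with a linear solve for numerical stability.
w = np.linalg.solve(X_design.T @ X_design, X_design.T @ y)
print(w.round(2))  # first entry is the intercept (~0); the rest recover true_w
```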
We discuss the Transformer, a purely attention-based architecture that is more performant, more efficient, and more parallelizable than recurrent network-based models.
I describe ELMo, a recently released set of neural word representations that are pushing the state of the art in natural language processing pretraining methodology.
A discussion of fundamental deep learning algorithms people new to the field should learn along with a recommended course of study.
Given all the recent buzz around artificial intelligence, I discuss three reasons why we are seeing such widespread interest in the field today.
A discussion of the most important skills necessary for being an effective machine learning engineer or data scientist.
Following my attendance at the 18th Annual Meeting on Discourse and Dialogue, I summarize the most promising directions for future dialogue research, as gleaned from discussions with other researchers at the conference.