Lectures

Here is a tentative schedule, which will likely change as the course goes on.

Suggested readings are just that: resources we recommend to help you understand the course material. They are not required; you are only responsible for the material covered in lecture.

ESL = The Elements of Statistical Learning, by Hastie, Tibshirani, and Friedman.
MacKay = Information Theory, Inference, and Learning Algorithms, by David MacKay.
Barber = Bayesian Reasoning and Machine Learning, by David Barber.
Bishop = Pattern Recognition and Machine Learning, by Chris Bishop.
Sutton and Barto = Reinforcement Learning: An Introduction, by Sutton and Barto.
Goodfellow = Deep Learning, by Goodfellow, Bengio, and Courville.

Week | Topic(s) and Dates | Slides & Suggested Readings | Important Dates
Week 1 Introduction
Nearest Neighbours, 1/15
[Slides] [Video]

ESL: Chapters 1, 2.1-2.3, and 2.5
Metacademy: k-nearest neighbors

Week 2 Decision Trees
Ensembles, 1/22
[Slides] [Video]

ESL: 9.2, 2.9, 8.7, 15
Metacademy: decision trees, entropy, mutual information, bias/variance decomposition, bagging, random forests

1/19: Hw1 out.
Week 3 Linear Regression
Linear Classifiers, 1/29
[Slides] [Video]

Bishop: 3.1, 4.1, 4.3
Course notes: linear regression, linear classifiers, logistic regression
Metacademy: linear regression, closed-form solution, gradient descent, ridge regression

Week 4 Softmax Regression
SVMs
Boosting, 2/5
[Slides] [Video]

Bishop: 7.1, 14.3
Course notes: optimization, SVMs and boosting

2/4: Hw1 due.

2/5: Hw2 out.

Week 5 Neural Networks, 2/12
[Slides] [Video]

Bishop: 5.1-5.3
Course notes: multilayer perceptrons, backprop

Reading Week Midterm review
[Practice questions]

2/18: Hw2 due.

Week 6 Convolutional Networks, 2/26
[Slides] [Video] [Simple neural net demo]

Course notes: conv nets, image classification
Goodfellow: 9.1-9.5

2/24: Midterm.
2/26: Hw3 out.
Week 7 PCA, K-Means, Autoencoders, and
Maximum Likelihood, 3/5
[Slides] [Video]
[Latent interpolation demo 1] [Latent interpolation demo 2]

Bishop: 12.1, 9.1
Course notes: mixture models

Week 8 Intro to Generative Models, 3/12
[Slides] [Video]

Bishop: 2.1-2.3, 4.2
MacKay: chapters 21, 23, 24
Course notes: probabilistic models

Probability Theory: The Logic of Science, by E. T. Jaynes
3/9: Hw3 due.
3/9: Hw4 out.
Week 9 Reinforcement Learning, 3/19
[RL Slides] [AlphaGo Slides] [Video]

Sutton and Barto: 3, 4.1, 4.4, 6.1-6.5

Week 10 Collaborative Filtering and Matrix Factorization, 3/26
[Slides] [Video]

Bishop: 9.2-9.4
Barber: 20.1-20.3
[Generating images from captions and vice versa]

3/23: Hw4 due. Project out.
Week 11 Good Friday (no class), 4/2
Week 12 Final project presentation, 4/9
Week 13 Algorithmic Fairness and Advanced Machine Learning Courses, Monday, 4/12
[Slides] [Video]

Tech Review article on fairness tradeoffs.

Barocas, Hardt, and Narayanan. Fairness and Machine Learning. Chapters 1 and 2.

Zemel et al., 2013. Learning fair representations.

Louizos et al., 2015. The variational fair autoencoder.

Hardt et al., 2016. Equality of opportunity in supervised learning.

Homeworks

Most homeworks will be due on Thursdays at 11:59pm. You will submit through [MarkUs].

             Out     Due     Materials   Starter Code
Homework 1   1/19    2/4     [Handout]   [Code]
Homework 2   2/5     2/18    [Handout]   [q1.py] [q2.py]
Homework 3   2/26    3/9     [Handout]
Homework 4   3/10    3/23    [Handout]   [Code]

Midterm

The midterm will be a take-home exam. It will be distributed online to everyone at the same time, and you will have 24 hours to complete it offline.

Final project

The final project replaces the final exam.
[Project Instructions and rubrics] [Starter Code and Data for Default Project]