Newer version: Fall 2017.
This course provides a unifying introduction to probabilistic modelling through the framework of graphical models, together with their associated learning and inference algorithms.
Teacher: Simon Lacoste-Julien, Office: 3339 André-Aisenstadt
Office hours: Friday 15h30-16h30
TA: Sarath Chandar, Office: 3336 André-Aisenstadt
Office hours: Tuesday 16h30-17h30
Fridays 13h30-15h30 - Z-200 Pav. Claire-McNicoll
Tuesdays 14h30-16h30 - 3195 Pav. André-Aisenstadt
The schedule for the weeks of November 21st and 28th is uncertain (to be discussed).
Probability review
Maximum likelihood estimation
Linear regression, logistic regression, Fisher discriminant
K-means, EM, Gaussian mixtures
Directed and undirected graphical models
Exponential family, information theory
Gaussian networks
Factor analysis
Sum-product algorithm, HMM, junction tree
Approximate inference: sampling, variational methods
Estimation of parameters in graphical models
Bayesian methods
Model selection
Homework (50%) – about 5 assignments; see the homework logistics below
Project (30%) – project report to hand in + poster presentation on Dec 13th; see the detailed info about projects
Final exam (20%) – take-home exam, after the poster presentation
The prerequisites are previous coursework in linear algebra, multivariate calculus, and basic probability and statistics. There will be programming for the assignments, so familiarity with a matrix-oriented programming language will be useful (no specific language is required; examples include Matlab/Octave, Python with numpy, etc.).
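For a sense of what "matrix-oriented" means here, below is a minimal sketch in Python with numpy; the choice of language and the Gaussian maximum-likelihood example are purely illustrative, not a requirement of the course.

    import numpy as np

    # Illustrative only: maximum likelihood estimates for a multivariate
    # Gaussian, written in vectorized (matrix-oriented) style, no loops.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 3))     # toy data: 500 samples, 3 dimensions

    mu_hat = X.mean(axis=0)               # MLE of the mean
    Xc = X - mu_hat                       # centered data (broadcasting)
    Sigma_hat = Xc.T @ Xc / X.shape[0]    # MLE of the covariance (1/n, biased)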
The course will follow the (unpublished) manuscript An Introduction to Probabilistic Graphical Models by Michael I. Jordan that will be made available to the students (but do not distribute!).
Supplementary reference: Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman.
The homework is to be handed in in paper form at the beginning of class (Tuesday) on the due date (derivations, proofs, report). The code (in the language of your choice) is submitted on Studium as a zipped file (or tar.gz) with a README explaining to the TA how to run and test it.
Collaboration policy: you may collaborate with colleagues while working on the homework, but you must write your own independent write-up. If you have collaborated with others on a question, credit their help by naming them in your write-up (proper acknowledgment is a good habit to build for academia later).
Late homework policy:
You have a budget of 6 late days that you can spend over the 5 homeworks. To use these days, declare it by writing it on the late homework when you hand it in.
To hand in a homework late, drop it in the designated box in MILA's administrative assistant's office (3255 André-Aisenstadt) during business hours. You also need to email the TA when this is done so that he is aware of your late homework.
The late-day penalty is as follows (deadlines are on Tuesdays):
handed in Tuesday after beginning of class: 10% penalty
handed in Wednesday: 20% penalty
handed in Thursday: 40% penalty
handed in Friday: 80% penalty
handed in later: no credit (unless you have used some of your late days – contact the TA in this case)
Below is a draft detailed outline that will be updated as the class goes on. The related chapters in Mike's book are given (but note that they do not correspond exactly to the class content). Related scribe notes from the class I taught in Paris (MVA) are linked for now, and will be replaced with this year's scribe notes as I receive them.
Date | Topics | Related chapters | Scribbled notes | Scribe notes | Homework milestones
Sept 2 | Set-up & overview | | lecture1 | |
Sept 6 | Probability review | 2.1.1 | lecture2 | William Léchelle: lecture2.pdf, source |
Sept 9 | Parametric models; frequentist vs. Bayesian | 5 | lecture3 | |
Sept 13 | Maximum likelihood; statistical decision theory | | lecture4 | MVA lecture1 | Hwk 1 (updated version) out
Sept 16 | Properties of estimators | | lecture5 | |
Sept 20 | Linear regression; logistic regression | 6, 7 | lecture6 | MVA lecture2 |
Sept 23 | Logistic regression; optimization; kernel trick | 7 | lecture7 | |
Sept 27 | Generative classification (Fisher); derivative tricks for Gaussian MLE; K-means | 10 | lecture8 | MVA lecture3 | Hwk 1 due; Hwk 2 out (data.zip)
Sept 30 | Gaussian mixtures; EM | 10, 11 | lecture9 | |
Oct 4 | Graph theory; directed graphical models | 2 | lecture10 | MVA lecture4 |
Oct 7 | Directed graphical models (cont.); undirected graphical models | 2 | lecture11 | |
Oct 11 | Undirected graphical models (cont.); inference: elimination algorithm | 2, 3 | lecture12 | | Hwk 2 due; Hwk 3 out (data.zip)
Oct 14 | Inference: sum-product alg.; (inference: junction tree alg.) | 4, (17) | lecture13 | MVA lecture7 |
Oct 18 | HMM and EM | 12 | lecture14 | |
Oct 21 | Exponential families; information theory | 8, 19 | lecture15 | MVA lecture5 |
Oct 25 | Exponential families (cont.); duality | 19 | lecture16 | MVA lecture6 | Hwk 3 due
Oct 28 | Gaussian networks | 13 | lecture17 | |
Nov 1 | Cancelled | | | |
Nov 4 | Factor analysis, PCA, CCA; (Kalman filter) | 14, (15) | lecture18 | |
Nov 8 | Sampling | 21 | lecture19 | MVA lecture8 | Hwk 4 out
Nov 11 | Sampling (cont.); MCMC | 21 | lecture20 | | Project group formed
Nov 15 | Variational methods | | lecture21 | |
Nov 18 | Estimation in graphical models; Bayesian methods; model selection | 9, 5, 26 | lecture22 | MVA lecture10 |
Nov 22 | Non-parametric models | 25 | | | Hwk 4 due
Nov 25 | Guest lecture: Aaron Courville | | | | Hwk 5 out
Nov 29 | No lecture | | | | Project: 1-page progress report due
Dec 6 | No lecture this week – work on your project! | | | |
Dec 13 | Poster presentation, 1pm–3:30pm in AA6225 | | | | Take-home final out
Dec 20 | | | | | Project report due; take-home final due; Hwk 5 due
Last modified: 2016-12-12 23h30