This course provides a unifying introduction to statistical modeling of multidimensional data through the framework of probabilistic graphical models, together with their associated learning and inference algorithms.
Teacher: Simon Lacoste-Julien, Office: 3339 André-Aisenstadt
Office hours: Friday 15h30-16h30
TA: Sarath Chandar, Office: 3336 André-Aisenstadt
Office hours: Tuesday 16h30-17h30
Lectures: Tuesday 14h30-16h30 - S-142, Pav. Roger-Gaudry (except Oct 10th; see the room change in the outline below)
Friday 13h30-15h30 - Z-317, Pav. Claire-McNicoll
Probability review
Maximum likelihood estimation
Linear regression, logistic regression, Fisher discriminant
K-means, EM, Gaussian mixtures
Directed and undirected graphical models
Exponential family, information theory
Gaussian networks
Factor analysis
Sum-product algorithm, HMM, junction tree
Approximate inference: sampling, variational methods
Estimation of parameters in graphical models
Bayesian methods
Model selection
Homework (50%) – about 5 assignments; see the homework logistics below
Project (30%) – project report to hand in + poster presentation on Dec 12th; detailed information about the projects is given separately
Final exam (20%) – take-home exam, handed out after the poster presentation
The prerequisites are previous coursework in linear algebra, multivariate calculus, and basic probability and statistics. There will be programming in the assignments, so familiarity with a matrix-oriented programming language will be useful (no specific language is required; examples include Matlab/Octave, Python with NumPy, etc.).
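To give a concrete sense of what "matrix-oriented programming" means here, below is a minimal, purely illustrative NumPy sketch (not part of any assignment; the data and variable names are made up) that fits a linear regression model by least squares, one of the topics covered early in the course.

    import numpy as np

    # Illustrative only: synthetic data standing in for a homework dataset.
    rng = np.random.default_rng(0)
    n, d = 100, 3
    X = rng.normal(size=(n, d))                 # design matrix: n samples, d features
    w_true = np.array([1.0, -2.0, 0.5])         # weights used to generate the targets
    y = X @ w_true + 0.1 * rng.normal(size=n)   # noisy targets

    # Maximum-likelihood (least-squares) estimate of the weights,
    # i.e. the minimizer of ||y - X w||^2.
    w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(w_hat)  # should be close to w_true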
The course will follow the (unpublished) manuscript An Introduction to Probabilistic Graphical Models by Michael I. Jordan that will be made available to the students (but do not distribute!).
Supplementary references:
For a very detailed and rigorous reference: Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman. Referred to as KF in the outline below.
See Part I of the Deep Learning book by Ian Goodfellow, Yoshua Bengio and Aaron Courville for a very gentle review of the applied math useful for this class. Chapter 5 contains a useful presentation of machine learning basics. Referred to as DL in the outline below.
Another classic book, with a more Bayesian perspective than Mike's manuscript (but, unlike it, actually complete), is Pattern Recognition and Machine Learning by Chris Bishop. Referred to as B in the outline below.
The homework is to be handed in on paper at the beginning of class (Tuesday) on the due date (derivations, proofs, report, graphs). The code (in the language of your choice) is submitted on Studium as a zipped file (or tar.gz) with a README explaining to the TA how to run and test it.
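(For instance, as a hypothetical format rather than a requirement, a submission for homework 1 could be a single hwk1_code.zip containing your scripts plus a README listing the commands that reproduce each figure or result in your report.)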
Collaboration policy: you can collaborate with colleagues while working on the homework, but you need to write your own independent write-up. If you have collaborated with others on a question, you need to credit their help by naming them in the write-up (proper acknowledgment is a good practice to develop for academia later).
Late homework policy:
You have a budget of 6 late days that you can spend over the 5 homework assignments. To use these days, you need to declare it by writing it on the late homework when you hand it in.
To hand in a homework late, you need to drop it in the designated box in the MILA administrative assistant's office, 3245 André-Aisenstadt, during business hours. You also need to send an email to the TA when this is done so that he is aware of your late homework.
The late-day penalty is the following (as deadlines are on Tuesdays):
handed in Tuesday after beginning of class: 10% penalty
handed in Wednesday: 20% penalty
handed in Thursday: 40% penalty
handed in Friday: 80% penalty
handed in later: you won't get any credit (unless you have used some of your late days – contact the TA in this case)
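For example, under this schedule a homework handed in on the Thursday after the deadline, with no late days declared, would receive at most 60% of its grade (a 40% penalty).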
Below is a draft detailed outline that will be updated as the class goes on. For now it is the outline copied from the Fall 2016 version of this class, with links to the relevant old scribbled notes; these will gradually be replaced with the new ones. The related chapters in Mike's book are given (note that they do not exactly correspond to the class content), along with occasional pointers to the Koller and Friedman book (KF), the Deep Learning book (DL) or Bishop's book (B). Related past scribe notes from the class that I taught in Paris are given for now, and will be replaced with this year's scribe notes as I receive them (if I receive some).
Date | Topics | Related chapters / scribbled notes | Scribe notes | Homework milestones
Sept 5 | Set-up & overview | intro slides; lecture1 | Isabela Albuquerque: lecture1.pdf, source | 
Sept 8 | Probability review | 2.1.1; DL: 3 (nice and gentle); KF: 2.1 (more rigorous); lecture2 | William Léchelle (Fa16): lecture2.pdf, source | 
Sept 12 | Parametric models; Frequentist vs. Bayesian | 5; lecture3 | Philippe Brouillard and Tristan Deleu: lecture3.pdf, source | Hwk 1 out
Sept 15 | Bayesian (cont.); Maximum likelihood | lecture4 | Philippe Brouillard and Tristan Deleu: lecture4.pdf | 
Sept 19 | Statistical decision theory | 1.3 in Bickel & Doksum; lecture5 | Sébastien Lachapelle: lecture5.pdf, source | 
Sept 22 | Properties of estimators; Linear regression | 6; lecture6 | Zakaria Soliman (Fa16): lecture6.pdf, source | 
Sept 26 | Linear regression (cont.); Logistic regression | DL: 7.1 (l2, l1-reg.); 6, 7; lecture7 | MVA lecture2 | Hwk 1 due; Hwk 2 out; data.zip
Sept 29 | Optimization; Gen. classification (Fisher); Derivative tricks for Gaussian MLE | DL: 4.3; Boyd's book; Matrix Diff. book; lecture8 | | 
Oct 3 | Kernel trick; K-means; Gaussian mixtures; EM | 10, 11; lecture9 | MVA lecture3 | 
Oct 6 | GMM and EM; Graph theory | 10, 11; 2; lecture10 | MVA lecture4 | 
Oct 10 (today: room Z-305 McNicoll) | Directed graphical models | 2; lecture11 | | Hwk 2 due; Hwk 3 out; data.zip
Oct 13 | DGM (cont.); Undirected graphical models | 2, 3; lecture12 | | 
Oct 17 | Inference: elimination alg., sum-product alg. | 4; lecture13 | MVA lecture7 | 
Oct 20 | Max-product; junction tree; HMM | 17, 12; lecture14 | | 
Oct 24 | Break: look at projects | | | 
Oct 27 | Break: look at projects | | | 
Oct 31 | HMM and EM; Information theory | 12, 19; lecture15 | MVA lecture5 | Hwk 3 due; Hwk 4 out
Nov 3 | Max entropy; Duality | 19; lecture16 | MVA lecture6 | 
Nov 7 | Exponential families; Sampling | 8; lecture17; 21 | MVA lecture8 | Project: team formed
Nov 10 | Sampling (cont.) | 21; lecture18 | | 
Nov 14 | MCMC sampling | 21; lecture19 | | 
Nov 17 (Sarath lecture) | Non-parametric models: Gaussian processes, Dirichlet processes | 25; lecture20; DP slides | | 
Nov 21 | Gibbs sampling (cont.); Variational methods | Bishop: 10.1; lecture21 | | Hwk 4 due; Hwk 5 out
Nov 24 | Variational methods (cont.); Estimation in graphical models | 9; lecture22 | | 
Nov 28 | Bayesian methods; Model selection | 5, 26; lecture23 | MVA lecture10 | Project: 1-page progress report due
Dec 1 (note: starting at 12:30pm) | Gaussian networks; Factor analysis, PCA, CCA (Kalman filter); VAE | lecture24; 13 (old lecture17); 14, (15) (old lecture18); VAE – DL: 20.10.3 | | 
Dec 5 | No lecture this week: work on your project! | | | 
Dec 12 | Poster presentation, 1:30pm-4:30pm, mezzanine of the Jean-Coutu atrium | | | Hwk 5 due; Take-home final out
Dec 20 | | | | Project report due; Take-home final due