IFT6289, Winter 2024

Natural Language Processing with Deep Learning



Time and Place

  • Mondays: 8:30am - 10:29am (1355 Pav. André-Aisenstadt)
  • Wednesdays: 8:30am - 10:29am (1355 Pav. André-Aisenstadt)
Please check the Studium platform for the links to the Zoom meetings and the Slack workspace of this course.
Note that the lectures will be recorded. The link to each recorded lecture will also be posted on Studium after class.

Instructor

Bang Liu
Email: firstname.lastname@umontreal.ca
Teaching Assistants: Suyuchen Wang (firstname.lastname@umontreal.ca), Rushil Gupta (firstname.lastname@umontreal.ca)

Course description

Natural Language Processing (NLP) is a branch of artificial intelligence that deals with the interaction between computers and humans using natural language. It is one of the most important technologies of the information age and is used everywhere: search engines, advertising, chatbots, language translation, virtual agents, and so on. Deep Learning approaches have achieved very high performance across many different NLP tasks in recent years. In this course, students will gain a thorough introduction to the basics of NLP, as well as cutting-edge research in Deep Learning for NLP. We will focus on modern techniques for NLP and introduce their applications in our daily lives. Students are encouraged to do some pretty cool research projects based on NLP techniques, e.g., writing poetry, detecting spam emails, building chatbots, machine reading comprehension, and so on. Through lectures, assignments, and a term project, students will learn the necessary skills to design, implement, and understand their own models for NLP tasks.

Prerequisites

  • Proficiency in Python
    All class assignments will be in Python (e.g., using NumPy and PyTorch).
  • College Calculus, Linear Algebra
    You should be comfortable taking (multivariable) derivatives and understanding matrix/vector notation and operations.
  • Basic Probability and Statistics
    You should know the basics of probability, Gaussian distributions, mean, standard deviation, etc.
  • Foundations of Machine Learning
    If you already have basic machine learning and/or deep learning knowledge, the course will be easier.

Reference Texts

No textbook is required, but the following texts, which can be read free online, are helpful.

Marking scheme

  • Readings (10%):
    7 reading assignments worth 1% each; one mini-lecture presentation is worth 3%.
  • Assignments (45%):
    3 programming and theoretical assignments worth 15% each
  • Term project (45%):
    Project proposal (up to 2 pages): 5%
    Midway report (up to 4 pages): 5%
    Final presentation: 5%
    Final report (up to 8 pages): 30%
  • Bonus: class participation (2%):
    Students are encouraged to actively participate in class, on Slack, etc.
Late policy:
A late day extends the deadline by 24 hours. For ALL assignments, submissions more than 2 late days (48 hours) after the deadline won't be accepted.
For programming assignments, we deduct 2% for each late day. We don't count hours; e.g., if you submit an assignment 25 hours after the deadline, it counts as 2 late days and 4% will be deducted.
For the project proposal and midway report, we deduct 1% for each late day.
For the project final report, we deduct 3% for each late day.
No late days are allowed for the final project presentation or the reading assignments.

Tentative Course Plan

Note: this tentative schedule is subject to change.
W 2s (or 2e) means the start (or the end) of the 2nd week; similarly for the other weeks.
Index Topic Materials
Section I: Introduction / background
Topic 1 (W 1s) Introduction to NLP
Topic 2 (W 1e, 2s) Neural Networks and Backpropagation
Section II: NLP core techniques
Topic 3 (W 2e) Language Modeling and Recurrent Neural Networks
Topic 4 (W 3s) Word Meaning and Word Embedding
Topic 5 (W 3e) Sentence Embeddings, Convolutional Neural Networks
Topic 6 (W 4s, 4e) Machine Translation, Sequence to Sequence, and Attention
Topic 7 (W 5s, 5e) Transformer, BERT, and GPTs
Topic 8 (W 6s, 6e) Prompting, In-context Learning, RLHF
Topic 9 (W 7s, 7e) Develop Deep Learning Models with PyTorch & DL Tricks
Topic 10 (W 8, 10, 11) More on Pre-trained Language Models (student mini-lectures)
Section III: Cutting-edge research topics
Topic 11 (W 12) Graph Neural Networks, Graph for NLP
Topic 12 (W 13) To be decided...
Topic 13 (W 13) To be decided...
Topic 14 (W 14) To be decided...
Topic 15 (W 15, 16) Final Project Presentations

Resources

Software

  • PyTorch: an open-source deep learning library.
  • DGL: an open-source library for deep learning on graphs.
  • HuggingFace Transformers: an open-source library containing PyTorch and TensorFlow implementations, pre-trained model weights, usage scripts, and conversion utilities for a variety of pre-trained language models.
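As a quick illustration of what these libraries offer, here is a minimal sketch that loads a pre-trained model with HuggingFace Transformers and runs it in PyTorch to get contextual token embeddings. The model name "bert-base-uncased" is only an illustrative choice, not a course requirement; assignments and projects may use different models.

    # Minimal sketch: load a pre-trained model with HuggingFace Transformers
    # and run a forward pass in PyTorch. "bert-base-uncased" is an example
    # model choice, not a course requirement.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenize a sentence and return PyTorch tensors.
    inputs = tokenizer("Natural Language Processing with Deep Learning",
                       return_tensors="pt")

    # Forward pass without gradient tracking; last_hidden_state holds one
    # contextual embedding vector per input token.
    with torch.no_grad():
        outputs = model(**inputs)

    print(outputs.last_hidden_state.shape)  # e.g., torch.Size([1, 8, 768])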