Instructor Lingpeng Kong (lpk AT
Season Fall 2021
Location CBA, Chow Yei Ching Building
TA Qintong Li (qtli AT
Course description:

Natural language processing (NLP) is the study of human language from a computational perspective. The course focuses on machine learning and corpus-based methods and algorithms, covering syntactic, semantic, and discourse processing models. We will describe the use of these methods and models in applications including syntactic parsing, information extraction, statistical machine translation, dialogue systems, and summarization. The course starts with language models (LMs), which are front and center in NLP, then introduces key machine learning (ML) ideas that students should grasp (e.g., feature-based models, log-linear models, and neural models). We will conclude with modern generic meaning representation methods (e.g., BERT/GPT-3) and the idea of pretraining / fine-tuning.
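To give a feel for the course's starting point, here is a minimal sketch of a count-based bigram language model, the simplest instance of the LMs covered in the first lectures. The function name and toy corpus are illustrative, not from the course materials.

```python
from collections import defaultdict

def train_bigram_lm(sentences):
    """Count bigrams over tokenized sentences (padded with <s>/</s>)
    and return maximum-likelihood estimates P(word | previous word)."""
    counts = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        for prev, cur in zip(tokens, tokens[1:]):
            counts[prev][cur] += 1
    probs = {}
    for prev, nexts in counts.items():
        total = sum(nexts.values())
        probs[prev] = {w: c / total for w, c in nexts.items()}
    return probs

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
lm = train_bigram_lm(corpus)
# "the" is followed by "cat" and "dog" equally often, so P(cat | the) = 0.5
print(lm["the"]["cat"])  # 0.5
```

Neural LMs (RNNLMs, Transformers) replace these count-based conditional probabilities with learned, parameterized ones, which is the thread the course follows through Parts I-III.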


Prerequisites:

COMP3314 or COMP3340, MATH1853


Assessment:

50% continuous assessment, 50% examination


Lecture     Topic/papers Recommended reading Materials
Part I    
Sept. 3 Introduction to NLP, Language Models [slides] [J&M Ch. 1] [Lee, 2004]
Sept. 7 Language Models, RNNLM [slides] [J&M Ch. 4] [J&M Ch. 7] [M. Collins, Notes 1][C. Dyer, LSTM Notes] [Assignment 1][Note]
Sept. 10 BERT, Pretraining + Fine-tuning [slides] [BERT paper]
Sept. 14 Computational Graphs [slides] [J&M Ch. 8.1 - 8.3] [M. Collins, Notes][C. Dyer, LSTM Notes]
Sept. 17 Sequence to Sequence Model [slides] [Sutskever et al, 2014]
Sept. 21 Attention Mechanism [slides] [Bahdanau et al, 2015]
Sept. 24 Transformers 1 [slides] [Vaswani et al, 2017] [The Annotated Transformer]
Sept. 29 Transformers 2 [slides] [Zheng et al, 2021] [Xu et al, 2019]
Part II    
Oct. 5 Parsing, Context-free Grammars [slides] [M. Collins, Notes] [J&M Ch. 12]
Oct. 8 Probabilistic Context-free Grammars [slides]
Oct. 19 Recursive Neural Networks, Shift-reduce Parsing and Stack-LSTMs [slides] [Stanford Sentiment Treebank] [Socher et al, 2013] [Dyer et al, 2015] [Assignment 2] [colab link]
Oct. 22 Dependency Parsing [slides] [J&M Ch. 14]
Oct. 26 Recurrent Neural Network Grammars [slides] [Dyer et al., 2016] [Kuncoro et al., 2017]
Part III    
Nov. 2 Large Pretrained Models [slides] [BART] [T5] [InfoWord] [GPT-3] [ELMo]
Nov. 5 Prompt, Prefix-Tuning and Adaptors [slides] [Liu et al., 2021] [Li and Liang, 2021] [Houlsby, 2019]
Nov. 9 Natural Language Generation [slides] [Holtzman et al., 2019]
Nov. 12 Controllable Text Generation [slides] [Ghazvininejad et al., 2017] [Dathathri et al., 2020]
Nov. 16 Question Answering [slides] [Rajpurkar et al., 2017] [Seo et al., 2017] [J&M Ch. 23] [Joshi et al., 2020] [Lee et al., 2019]
Nov. 19 Multilinguality [slides] [Universal Dependencies] [Pires et al., 2019] [Lample and Conneau, 2019] [Liu et al., 2020] [Assignment 3]
Nov. 23 Multimodality, NLP + Vision [slides] [VQA] [Xu et al., 2015] [GQA] [Hudson and Manning, 2018]
Nov. 26 Model Interpretability [slides] [Wu et al., 2020] [Tenney et al., 2020]
Nov. 30 All Questions Answered



Assignments:

Please submit to Moodle a zip file that contains (1) your code, (2) a write-up (PDF) that explains your model, and (3) your model's predictions (strictly following the required format). Please name your zip file in the required format.


For each assignment, you may use the programming language of your choice (e.g., Python or C++) and any deep learning framework (e.g., PyTorch or TensorFlow).


We will review your work individually to ensure that you receive due credit. Please note that both your program's output and its logic will be considered in marking.

Policy and honor code:

You are free to discuss ideas and implementation details with other students. However, copying others' code will not help your study; it will jeopardize it. We will check your work against other submissions and Internet sources, and it is easy to tell whether the work is your own. To be clear, we encourage you to discuss with your classmates, but you MUST do your work independently and CANNOT simply copy. If plagiarism is identified, you may face serious consequences under Faculty and University policy.