Instructor Lingpeng Kong (lpk AT cs.hku.hk)
Season Fall 2024
Time & Location Mondays, 7pm - 10pm @ KB-223 (Knowles Bldg)
TAs Qintong Li, Qiushi Sun
Course description:

Natural language processing (NLP) is the study of human language from a computational perspective. The course focuses on machine learning and corpus-based methods and algorithms. We will cover syntactic, semantic, and discourse processing models, and describe their use in applications including syntactic parsing, information extraction, statistical machine translation, dialogue systems, and summarization. The course starts with language models (LMs), which are front and center in NLP, and then introduces key machine learning (ML) ideas that students should grasp (e.g., feature-based models, log-linear models, and then neural models). We will end with modern generic meaning representation methods (e.g., BERT/GPT) and the idea of pretraining and finetuning.
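
To make the starting point concrete: a language model assigns probabilities to word sequences. The short Python sketch below estimates a smoothed bigram model; it is an illustrative example only (the function name, smoothing choice, and toy corpus are ours, not course-provided code).

    from collections import defaultdict

    def train_bigram_lm(corpus, alpha=1.0):
        # corpus: list of tokenized sentences; alpha: add-alpha smoothing constant (illustrative)
        counts = defaultdict(lambda: defaultdict(int))
        vocab = set()
        for sent in corpus:
            tokens = ["<s>"] + sent + ["</s>"]
            vocab.update(tokens)
            for prev, cur in zip(tokens, tokens[1:]):
                counts[prev][cur] += 1
        V = len(vocab)
        def prob(prev, cur):
            # smoothed estimate of P(cur | prev)
            total = sum(counts[prev].values())
            return (counts[prev][cur] + alpha) / (total + alpha * V)
        return prob

    prob = train_bigram_lm([["natural", "language", "processing"]])
    print(prob("natural", "language"))  # smoothed P(language | natural)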

Prerequisites:

COMP3314 or COMP3340, and MATH1853

Assessment:

50% continuous assessment, 50% course project

Schedule

Lecture     Topic/papers Recommended reading Materials
1 - 2 Introduction to NLP, Language Models [slides] [J&M Ch. 1] [Lee, 2004] [J&M Ch. 4] [J&M Ch. 7] [M. Collins, Notes 1] [Course Project]
3 Computational Graphs / RNN Language Models [slides] [J&M Ch. 8.1 - 8.3] [M. Collins, Notes]
4 Sequence to Sequence Model and Attention (see the attention sketch below the schedule) [slides] [Sutskever et al, 2014] [Bahdanau et al, 2015] [Holtzman et al., 2019] [Hyung Won Chung et al., 2022]
5 Transformers [slides] [Vaswani et al, 2017] [The Annotated Transformer] [Assignment 1]
6 Regular Expressions and Context-free Grammars [slides] [J&M Ch. 2] [M. Collins, Notes] [J&M Ch. 12]
7 Shift-reduce Parsing, Recursive Neural Networks, Recurrent Neural Network Grammars [slides] [Stanford Sentiment Treebank] [Socher et al, 2013] [Dyer et al, 2015] [Dyer et al., 2016] [Kuncoro et al., 2017]
8 Building Large Language Models [slides]
DLC AlphaGo [slides] [Silver et al, 2016]
9 Text Diffusion Models, Multimodal LLMs [slides] [RDM] [DiffuSeq] [VLFeedback] [G-LLaVA] [Assignment 2]
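
For students who want a preview of Lectures 4 - 5, the NumPy sketch below shows scaled dot-product attention, the core operation behind the attention and Transformer papers listed above. The function name and array shapes are illustrative assumptions, not code from the course materials.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q: (n_queries, d), K: (n_keys, d), V: (n_keys, d_v)
        scores = Q @ K.T / np.sqrt(Q.shape[-1])          # query-key similarities
        scores -= scores.max(axis=-1, keepdims=True)     # for numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
        return weights @ V                               # weighted sum of values

    rng = np.random.default_rng(0)
    Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
    print(scaled_dot_product_attention(Q, K, V).shape)   # (2, 4)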

Assignments

Evaluation:

We will review your work individually to ensure that you receive due credit. Please note that both your project output and the logic behind it will be considered in marking.

Policy and honor code:

You are free to discuss ideas and implementation details with other students. However, copying others' code will not help your study; it will jeopardize it. We will check your work against other submissions and Internet sources, and it is easy to tell whether the work is your own. To be clear: we encourage you to discuss with your classmates, but you MUST do your work independently and CANNOT simply copy. If plagiarism is identified, you may face serious consequences under Faculty and University policy.