EE 518, Spring 2024

Machine Learning Theory & Algorithms

This is the course website for EE 518, Machine Learning Theory & Algorithms, for the Spring 2024 quarter.

Meeting time: Mon/Wed 11:30AM-1:20PM, Engineering Building 102
Office hours: Wed 1:30-2:30PM (location TBD), Thurs 3:00-4:00PM (Zoom only)

Course Description

The goal of this course is to provide a thorough understanding of the fundamental methodologies and algorithms used in machine learning. Students will learn to understand, implement, and innovate on algorithms for common tasks such as classification, regression, clustering, and dimensionality reduction. Algorithms covered include linear and nonlinear regression, ensemble methods, support vector machines, K-means, hierarchical clustering, and Gaussian mixture models. Theory covered will include empirical risk minimization, generalization bounds, bias-variance tradeoff, PAC learning, and VC dimension.
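For concreteness, here is the empirical risk minimization rule in the notation of UML (an illustrative summary only, not assigned material): given a training sample \(S = ((x_1, y_1), \ldots, (x_m, y_m))\) and a hypothesis class \(\mathcal{H}\), ERM returns

\[
h_S \in \operatorname*{argmin}_{h \in \mathcal{H}} L_S(h),
\qquad
L_S(h) = \frac{1}{m} \sum_{i=1}^{m} \ell\big(h, (x_i, y_i)\big),
\]

where \(\ell\) is the loss incurred on a single example. Much of the theory in the course (uniform convergence, generalization bounds, VC dimension) concerns when the empirical risk \(L_S(h)\) is a reliable estimate of the true risk.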

Students should have a firm understanding of linear algebra and probability, at the level of EE 516: Mathematical Foundations of Machine Learning and EE 520: Random Processes, respectively. Students should also be comfortable with programming in Python.
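As a rough indication of the expected programming fluency, below is a minimal NumPy sketch of a 1-nearest-neighbor classifier (one of the first algorithms covered); the function and variable names are illustrative and not course-provided code.

import numpy as np

def nearest_neighbor_predict(X_train, y_train, X_test):
    # Predict each test point's label as the label of its closest training point.
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distances to all training points
        preds.append(y_train[np.argmin(dists)])      # label of the nearest training point
    return np.array(preds)

# Toy usage: the test point is closest to the class-1 training point.
X_train = np.array([[0.0, 0.0], [1.0, 1.0]])
y_train = np.array([0, 1])
X_test = np.array([[0.9, 1.2]])
print(nearest_neighbor_predict(X_train, y_train, X_test))  # prints [1]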

Textbooks: The course will use the free textbooks listed below.

Syllabus: The syllabus can be found here.

Communication: All written questions should be posted to the appropriate channel on the Slack workspace (see Homework 0).

Course Schedule

This course follows a flipped-classroom model. You are required to complete the readings from Understanding Machine Learning (UML) listed below before the corresponding class time. Class time will be used to discuss the topics that students found most confusing or difficult. Readings from The Elements of Statistical Learning (ESL) are optional. Assignments are typically due Fridays at 11:59PM.

Date | Lecture | Topic                                     | UML Pages    | ESL Pages | Assignment Due (Friday)
4/1  | 1       | introduction, empirical risk minimization | 19-29, 33-41 | 9-38      |
4/3  | 2       | PAC learning                              | 43-50        |           | HW0
4/8  | 3       | uniform convergence                       | 54-58        |           |
4/10 | 4       | nearest neighbor rules                    | 258-265      | 463-480   | HW1
4/15 | 5       | bias-complexity tradeoff                  | 60-66        | 219-229   |
4/17 | 6       | VC-dimension                              | 67-78        | 237-240   | HW2
4/22 | 7       | (catch up)                                |              |           |
4/24 | 8       | boosting                                  | 130-142      | 337-380   | MP1
4/29 | 9       | (catch up)                                |              |           |
5/1  | 10      | decision trees                            | 250-256      | 295-334   | HW3
5/6  | 11      | model selection                           | 144-154      | 241-256   |
5/8  | 12      | convex learning                           | 156-169      |           | HW4
5/13 | 13      | (catch up)                                |              |           |
5/15 | 14      | regularization & stability                | 171-181      | 61-72     | MP2
5/20 | 15      | support vector machines                   | 202-213      | 417-438   |
5/22 | 16      | (catch up)                                |              |           | HW5
5/27 | 17      | NO CLASS                                  |              |           | HW6
5/29 | 18      | kernels                                   | 215-224      | 417-438   |
6/3  | 19      | kernels                                   | 342-355      | 261-270   |
6/5  | 20      | clustering and generative models          | 307-320      | 501-527   | MP3

Assignments

All assignments must be submitted via Gradescope to obtain credit. See Homework 0 below for information on how to set up an account.

I provide the \(\LaTeX\) source file used to generate each homework below; you must use it as a template to receive extra credit.

Resources