EE 516, Winter 2023

Mathematical Foundations of Machine Learning

This is the course website for EE 516, Mathematical Foundations of Machine Learning, Winter 2023 quarter.

Meeting time: Tue/Thu 6:40-8:30PM, FAB 150
Office hours: Wed 12:30-1:30PM Online via Zoom, Thu 8:30-9:30PM FAB 150 (or by appointment)

Course Description

The goal of this course is to move from familiarity to fluency with the use of linear algebra to solve problems in machine learning and signal processing. Through a combination of theory, modeling, and applications, students will gain intuition into the fundamentals of matrix methods and optimization. Topics covered include least squares, the singular value decomposition, eigenvalue decomposition, subspace methods, and optimization methods such as stochastic gradient descent, ADMM, and iteratively reweighted least squares. Applications will include principal components analysis, image compression and denoising, low rank matrix completion, kernel ridge regression, and spectral clustering.

While the course emphasizes mathematical analysis, there is a significant programming component that may be completed using either MATLAB or Python 3.
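To give a taste of that programming component, here is a small, purely illustrative Python 3 sketch (not a course assignment) touching two syllabus topics: it solves a least-squares problem with NumPy's built-in solver and again via the singular value decomposition, and checks that the two answers agree. The random data here is made up for demonstration.

```python
# Illustrative sketch: solve min_x ||Ax - b||_2 two ways and compare.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))   # tall matrix: more equations than unknowns
b = rng.standard_normal(20)

# 1) NumPy's built-in least-squares solver
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

# 2) Via the thin SVD A = U S V^T, the minimizer is x = V S^{-1} U^T b
U, S, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / S)

print(np.allclose(x_lstsq, x_svd))  # prints True: the two solutions match
```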

Textbook: This course does not require a textbook. Lecture notes will be provided via the course Slack workspace. Reference textbooks can be found in the Resources section below.

Communication: All written questions should be posted to the appropriate channel on the Slack workspace (see Homework 0).

Course Schedule

This course follows a flipped-classroom model. You are responsible for reading the listed lecture notes before class on the specified day. Please post any questions you have to the course Slack channel. I will spend the first portion of class giving a mini-lecture on the most important or confusing topics, and you will work problems in groups during the rest of class.

| Date | Lecture | Sections | Notes | Exercises | Solutions |
|------|---------|----------|-------|-----------|-----------|
| 1/10 | 1 | 0.1 - 1.20 | Syllabus & Introduction, Introduction to Matrices | Exercises 1 | Solutions 1 |
| 1/12 | 2 | 1.21 - 1.55 | Introduction to Matrices | Exercises 2 | Solutions 2 |
| 1/17 | 3 | 2.1 - 2.24 | Matrix Factorizations & Decompositions | | |
| 1/19 | 4 | 2.25 - 2.29, 3.1 - 3.15 | Matrix Factorizations & Decompositions, Subspaces & Rank | Exercises 3 | Solutions 3 |
| 1/24 | 5 | 3.16 - 3.46 | Subspaces & Rank | Exercises 4 | Solutions 4 |
| 1/26 | 6 | | Demo 1: Spectral Clustering | Demo 1 Starter Code | Demo 1 Solution Code |
| 1/31 | 7 | 4.1 - 4.27 | Least Squares | Exercises 5 | Solutions 5 |
| 2/2 | 8 | 4.28 - 4.52 | Least Squares | | |
| 2/7 | 9 | | Exam 1 | | |
| 2/9 | 10 | 6.1 - 6.30 | Low-Rank Approximation | Exercises 6 | Solutions 6 |
| 2/14 | 11 | | Demo 2: Nearest-Subspace Classifier | Demo 2 Starter Code | Demo 2 Solution Code |
| 2/16 | 12 | 6.31 - 6.37, 7.1 - 7.10 | Low-Rank Approximation, Optimization Basics | Exercises 7 | Solutions 7 |
| 2/21 | 13 | | Demo 3: Kernel Ridge Regression | Demo 3 Starter Code | Demo 3 Solution Code |
| 2/23 | 14 | | No Class | | |
| 2/28 | 15 | 7.11 - 7.16, 8.1 - 8.3 | Optimization Basics, Sparse Regression | Exercises 8 | Solutions 8 |
| 3/2 | 16 | | Exam 2 | | |
| 3/7 | 17 | | ADMM, IRLS | | |
| 3/9 | 18 | | Demo 4: Robust PCA | Demo 4 Starter Code | Demo 4 Solution Code |
| 3/14 | 19 | | Low-Rank Matrix Completion, Accelerated First-Order Methods | | |
| 3/16 | 20 | | Demo 5: Low-Rank Matrix Completion | Demo 5 Starter Code | Demo 5 Solution Code |

Homework

All assignments must be submitted via Gradescope to receive credit. See Homework 0 below for information on how to set up an account.

I provide the \(\LaTeX\) source file used to generate each homework below. To receive extra credit, you must use this file as a template for your solutions.

Project

Please read the project description and template files.

Optional Weekly Readings

Below are some completely optional, less technical readings that you may find interesting.

  1. The State of Being Stuck: This is great to keep in mind as you work problems for this class.

  2. How to solve hard problem sets: I found this helpful when I started graduate school.
    • A somewhat related article on how walking around is useful when solving problems.
  3. Does one have to be a genius to do maths?: Written by Terence Tao, a Fields Medal winner who has published on basically everything, including signal processing.
    • I also recommend his whole section on career advice, and his whole blog in general.
  4. Podcast on “Deep Work”
    • Cal Newport also has an interesting blog that I highly recommend.
  5. Excerpt from The Art and Craft of Problem Solving: I think the high-level overview of what makes a problem (versus an exercise) can be helpful. I recommend this book as a whole for learning to think about and solve difficult problems.

  6. How technology is hijacking your mind
  7. Interview with Cathy O’Neil, author of the book Weapons of Math Destruction

  8. The Unreasonable Effectiveness of Mathematics: I’m a big fan of Richard Hamming’s writing (outside of his research contributions).
  9. A sober take on deep learning by Michael Jordan: Not the basketball player. If you want to go down a bit of a rabbit hole, some interesting related discussion is below.
  10. A really excellent YouTube series with the biggest names in machine learning

Resources