This is the course website for EE 516, Mathematical Foundations of Machine Learning, Winter 2023 quarter.
Meeting time: Tue/Thu 6:40-8:30PM, FAB 150
Office hours: Wed 12:30-1:30PM Online via Zoom, Thu 8:30-9:30PM FAB 150 (or by appointment)
Course Description
The goal of this course is to move from familiarity to fluency with the use of linear algebra to solve problems in machine learning and signal processing. Through a combination of theory, modeling, and applications, students will gain intuition into the fundamentals of matrix methods and optimization. Topics covered include least squares, the singular value decomposition, the eigenvalue decomposition, subspace methods, and optimization methods such as stochastic gradient descent, ADMM, and iteratively reweighted least squares. Applications will include principal component analysis, image compression and denoising, low-rank matrix completion, kernel ridge regression, and spectral clustering.
While the course emphasizes mathematical analysis, there is a significant programming component that may be completed using either MATLAB or Python 3.
Textbook: This course does not require a textbook. Lecture notes will be provided via the course Slack workspace. Reference textbooks can be found in the Resources section below.
Communication: All written questions should be posted to the appropriate channel on the Slack workspace (see Homework 0).
Course Schedule
This course uses a flipped classroom. You are responsible for reading the listed lecture notes before class on the specified day. Please post any questions you have to the course Slack channel. I will spend the first portion of class giving a mini-lecture on the most important or confusing topics, and you will work problems in groups during the rest of class.
Date | Lecture | Sections | Notes | Exercises | Solutions |
---|---|---|---|---|---|
1/10 | 1 | 0.1 - 1.20 | Syllabus & Introduction, Introduction to Matrices | Exercises 1 | Solutions 1 |
1/12 | 2 | 1.21 - 1.55 | Introduction to Matrices | Exercises 2 | Solutions 2 |
1/17 | 3 | 2.1 - 2.24 | Matrix Factorizations & Decompositions | — | — |
1/19 | 4 | 2.25 - 2.29, 3.1 - 3.15 | Matrix Factorizations & Decompositions, Subspaces & Rank | Exercises 3 | Solutions 3 |
1/24 | 5 | 3.16 - 3.46 | Subspaces & Rank | Exercises 4 | Solutions 4 |
1/26 | 6 | — | Demo 1: Spectral Clustering | Demo 1 Starter Code | Demo 1 Solution Code |
1/31 | 7 | 4.1 - 4.27 | Least Squares | Exercises 5 | Solutions 5 |
2/2 | 8 | 4.28 - 4.52 | Least Squares | — | — |
2/7 | 9 | — | Exam 1 | — | — |
2/9 | 10 | 6.1 - 6.30 | Low-Rank Approximation | Exercises 6 | Solutions 6 |
2/14 | 11 | — | Demo 2: Nearest-Subspace Classifier | Demo 2 Starter Code | Demo 2 Solution Code |
2/16 | 12 | 6.31 - 6.37, 7.1 - 7.10 | Low-Rank Approximation, Optimization Basics | Exercises 7 | Solutions 7 |
2/21 | 13 | — | Demo 3: Kernel Ridge Regression | Demo 3 Starter Code | Demo 3 Solution Code |
2/23 | — | — | No Class | — | — |
2/28 | 15 | 7.11 - 7.16, 8.1 - 8.3 | Optimization Basics, Sparse Regression | Exercises 8 | Solutions 8 |
3/2 | 16 | — | Exam 2 | — | — |
3/7 | 17 | — | ADMM, IRLS | — | — |
3/9 | 18 | — | Demo 4: Robust PCA | Demo 4 Starter Code | Demo 4 Solution Code |
3/14 | 19 | — | Low-Rank Matrix Completion, Accelerated First-Order Methods | — | — |
3/16 | 20 | — | Demo 5: Low-Rank Matrix Completion | Demo 5 Starter Code | Demo 5 Solution Code |
Homework
All assignments must be submitted via Gradescope to receive credit. See Homework 0 below for information on how to set up an account.
I provide the \(\LaTeX\) source file used to generate each homework below. You must use it as a template to receive extra credit.
- Homework 0, Due: January 13, 2023 (pdf) (tex)
- Homework 1, Due: January 20, 2023 (pdf) (tex)
- Homework 2, Due: January 27, 2023 (pdf) (tex) (files)
- Homework 3, Due: February 3, 2023 (pdf) (tex) (files)
- Homework 4, Due: February 12, 2023 (pdf) (tex) (files)
- Homework 5, Due: February 17, 2023 (pdf) (tex) (files)
- Homework 6, Due: February 24, 2023 (pdf) (tex) (files)
- Homework 7, Due: March 5, 2023 (pdf) (tex) (files)
- Homework 8, Due: March 12, 2023 (pdf) (tex) (files)
- Homework 9, Due: March 19, 2023 (pdf) (tex) (files)
Project
Please read the project description and template files.
- Please turn in your groups and selected topic with Homework 5.
- I will read your project writeup once and give feedback, provided you send it to me by 11:59PM on March 17, 2023.
Optional Weekly Readings
Below are some completely optional, less technical readings that you may find interesting.
- The State of Being Stuck: This is great to keep in mind as you work problems for this class.
- How to solve hard problem sets: I found this helpful when I started graduate school.
- A somewhat related article on how walking around is useful when solving problems.
- Does one have to be a genius to do maths?: Written by Terence Tao, a Fields Medal winner who has published on basically everything, including signal processing.
- I also recommend his section on career advice, and his blog in general.
- Podcast on “Deep Work”
- Cal Newport also has an interesting blog that I highly recommend.
- Excerpt from The Art and Craft of Problem Solving: I think the high-level overview of what makes a problem (versus an exercise) can be helpful. I recommend this book as a whole for learning to think about and solve difficult problems.
- How technology is hijacking your mind
- Interview with Cathy O’Neil, author of the book Weapons of Math Destruction
- The Unreasonable Effectiveness of Mathematics: I’m a big fan of Richard Hamming’s writing (outside of his research contributions).
- See also, You and Your Research
- A sober take on deep learning by Michael Jordan: Not the basketball player. If you want to go down a bit of a rabbit hole, some interesting related discussion is below.
- A more recent article by Michael Jordan
- Ali Rahimi & Ben Recht’s Test of Time Talk: This is a talk from NIPS 2017 that sparked considerable discussion on whether deep learning is “alchemy.”
- Yann LeCun’s response
- A really excellent YouTube series with the biggest names in machine learning
Resources
- \(\LaTeX\): The best way to learn is to hack examples, like those I provide for the homework assignments above. A few other good resources are below.
- tutorial
- Learn LaTeX in 30 minutes
- wikibook
- LaTeX math symbols
- Overleaf: An online LaTeX editor with a Google Docs flavor
- An easy way to include code with your \(\LaTeX\) file is via the pdfpages package.
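As a sketch of how that can look, a homework source file can append a PDF of your code (e.g., an exported notebook) using pdfpages; the file name here is just a placeholder:

```latex
\documentclass{article}
\usepackage{pdfpages}  % provides \includepdf for inserting external PDF pages

\begin{document}
% ... your written solutions ...

% Append every page of the exported code PDF
% (hw_code.pdf is an illustrative name)
\includepdf[pages=-]{hw_code.pdf}
\end{document}
```

The `pages=-` option includes all pages of the target PDF; a range like `pages=1-3` works too.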
- Recommended Textbooks: There is no required textbook for this course. The books below may be helpful references.
- Introduction to Applied Linear Algebra – Vectors, Matrices, and Least Squares
- Linear Algebra, MIT Open CourseWare
- 3 Blue 1 Brown Essence of Linear Algebra
- Matrix Perturbation Theory, G. W. Stewart and Ji-guang Sun, ISBN: 9780126702309
- Matrix Computations, Gene H. Golub and Charles F. Van Loan, ISBN: 9787115208804
- Matrix Analysis for Scientists and Engineers, Alan Laub, ISBN: 9780898715767
- The Matrix Cookbook, available via web search
- Python Resources: If you choose to code in Python, install the latest stable version of Python 3 using the Anaconda distribution. I also recommend using Jupyter notebooks, though this is not required.
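Once installed, a quick sanity check that the scientific stack works is to solve a small least-squares problem (one of the course topics) with NumPy; the numbers below are purely illustrative:

```python
import numpy as np

# Build a small synthetic least-squares problem: a tall data matrix A
# and noiseless observations b = A @ x_true.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true

# lstsq solves min_x ||Ax - b||_2 (internally via the SVD);
# with noiseless data it recovers x_true exactly.
x_hat, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_true))  # prints True
```

If this runs and prints True, NumPy is installed correctly and you are ready for the programming assignments.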