This is the course website for EE 510, Mathematical Foundations of Machine Learning, Winter 2021 quarter.
Meeting time: Tue/Thu 6:40-8:30PM, Online via Zoom
Office hours: Wed 1:00-2:00PM, Thu 8:30-9:30PM (or by appointment), Online via Zoom
Course Description
The goal of this course is to move from familiarity to fluency with the use of linear algebra to solve problems in machine learning and signal processing. Through a combination of theory, modeling, and applications, students will gain intuition into the fundamentals of matrix methods and optimization. Topics covered include least squares, the singular value decomposition, eigenvalue decomposition, subspace methods, and optimization methods such as stochastic gradient descent, ADMM, and iteratively reweighted least squares. Applications will include principal components analysis, image compression and denoising, low rank matrix completion, kernel ridge regression, and spectral clustering.
While the course emphasizes mathematical analysis, there is a significant programming component that may be completed using either MATLAB or Python 3.
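To give a flavor of the programming component, below is a minimal Python/NumPy sketch (not taken from the course materials) of one of the core topics listed above: solving a least-squares problem. The data here is synthetic and purely illustrative; in MATLAB, the backslash operator plays the same role.

```python
# Minimal sketch: solve min_x ||Ax - b||_2 with NumPy on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))                 # tall data matrix (overdetermined system)
x_true = rng.standard_normal(5)                   # ground-truth coefficients
b = A @ x_true + 0.01 * rng.standard_normal(100)  # noisy observations

# lstsq uses an SVD-based solver and returns the minimum-norm solution.
x_hat, residuals, rank, svals = np.linalg.lstsq(A, b, rcond=None)
print("estimation error:", np.linalg.norm(x_hat - x_true))
```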
Textbook: This course does not require a textbook. Reference textbooks can be found in the Resources section below.
Communication: Due to the COVID-19 pandemic, this quarter’s course will be held completely online. All written questions should be posted to the appropriate channel on the Slack workspace (see Homework 0). Regular course meetings will be held via Zoom.
Course Schedule
This course follows a flipped-classroom format. You are responsible for reading the listed lecture notes before class on the specified day. Please post any questions you have to the course Slack channel. I will spend the first portion of class giving a mini-lecture on the most important or confusing topics, and you will work problems in groups during the rest of class.
Homework
All assignments must be submitted via Gradescope to receive credit. See Homework 0 below for information on how to set up an account.
I provide the \(\LaTeX\) source used to generate each homework below; you must use it as a template for your solutions to receive extra credit.
- Homework 0, Due: January 8, 2021 (pdf) (tex)
- Homework 1, Due: January 15, 2021 (pdf) (tex)
- Homework 2, Due: January 22, 2021 (pdf) (tex) (files)
- Homework 3, Due: January 29, 2021 (pdf) (tex) (files)
- Homework 4, Due: February 7, 2021 (pdf) (tex) (files)
- Homework 5, Due: February 12, 2021 (pdf) (tex) (files)
- Homework 6, Due: February 19, 2021 (pdf) (tex) (files)
- Homework 7, Due: February 28, 2021 (pdf) (tex) (files)
- Homework 8, Due: March 5, 2021 (pdf) (tex) (files)
- Homework 9, Due: March 14, 2021 (pdf) (tex) (files)
Project
Please read the project description and template files.
- Please turn in your group members and selected topic with Homework 5.
- I will read your project writeup once and give feedback, provided you get it to me by 11:59PM on March 12, 2021.
Resources
- \(\LaTeX\): The best way to learn is to hack examples, like those I provide for the homework assignments above. A few other good resources are below.
- tutorial
- Learn LaTeX in 30 minutes
- wikibook
- LaTeX math symbols
- Overleaf: An online LaTeX editor with a Google Docs flavor
- An easy way to include code with your \(\LaTeX\) file is via the pdfpages package (see the sketch after this list).
- Recommended Textbooks: There is no required textbook for this course. The resources below may be helpful.
- Introduction to Applied Linear Algebra – Vectors, Matrices, and Least Squares
- Linear Algebra, MIT Open CourseWare
- 3 Blue 1 Brown Essence of Linear Algebra
- Matrix Perturbation Theory, G. W. Stewart and Ji-guang Sun, ISBN: 9780126702309
- Matrix Computations, Gene H. Golub and Charles F. Van Loan, ISBN: 9787115208804
- Matrix Analysis for Scientists and Engineers, Alan Laub, ISBN: 9780898715767
- The Matrix Cookbook, available via web search
- Python Resources: If you choose to code in Python, you should install Python 3 (latest stable version) using the Anaconda distribution. I also recommend making use of Jupyter notebooks, but this is not required.
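As mentioned in the \(\LaTeX\) resources above, the pdfpages package is an easy way to append code to a homework write-up. Below is a minimal sketch, assuming you have already exported your code or notebook to a PDF named code.pdf (a hypothetical filename).

```latex
% Minimal sketch: append a PDF of your code to a write-up with pdfpages.
\documentclass{article}
\usepackage{pdfpages}

\begin{document}
% ... your written solutions ...

% Include every page of code.pdf at the end of the document.
\includepdf[pages=-]{code.pdf}
\end{document}
```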