Meeting time: Mon/Weds 6:40-8:30 PM, Engineering Building 102
Office hours: Mon/Weds 5:30-6:30 PM (immediately before lecture), FAB 85-03
This course provides an introduction to the theory and practice of deep learning, with an emphasis on deep neural network-based approaches. You will gain a strong understanding of the principles of machine learning through the lens of these networks. You will get to know the most prominent models, such as convolutional and recurrent neural networks, along with topics that are the subject of current research, such as representation learning and deep generative models. As a student, you can expect to learn the concepts, methods, and techniques necessary to put deep learning to work in modern applications.
- Here (pdf)
All assignments must be submitted via Gradescope to receive credit. Information on how to set up an account is included in Homework 0.
The LaTeX source file used to generate each homework is provided below. Feel free to use it as a starting point for typing up your solutions.
- Homework 0, Due: April 8, 2019 (pdf) (tex)
- Homework 1, Due: April 16, 2019 (pdf) (tex)
- Homework 2, Due: April 23, 2019 (pdf) (tex)
The project description and template will be available here. I will read your project description once and give feedback, provided you submit it to me by 11:59 PM on May 31, 2019.
Files for Python Tutorials
- Tutorial 1 PLA (ipynb)
- Tutorial 2 Linear Regression (ipynb)
- Tutorial 3 Logistic Regression (ipynb)
- Tutorial 4 MLP using SGD on MNIST (ipynb)
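As a taste of what Tutorial 1 covers, here is a minimal sketch of the perceptron learning algorithm (PLA) in NumPy. The toy data, function name, and iteration cap below are illustrative choices, not taken from the course notebooks.

```python
import numpy as np

def pla(X, y, max_iters=1000):
    """Perceptron learning algorithm on linearly separable data.

    X: (n, d) array of inputs (a constant-1 column serves as the bias).
    y: (n,) array of labels in {-1, +1}.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_iters):
        preds = np.sign(X @ w)
        misclassified = np.where(preds != y)[0]
        if misclassified.size == 0:
            return w  # every point classified correctly
        i = misclassified[0]
        w += y[i] * X[i]  # PLA update: nudge w toward the misclassified point
    return w

# Toy linearly separable data: label is the sign of x1 + x2 - 1
X = np.array([[1.0, 0.0, 0.0],   # leading 1 is the bias feature
              [1.0, 2.0, 2.0],
              [1.0, 0.0, 2.0],
              [1.0, 2.0, 0.0]])
y = np.array([-1.0, 1.0, 1.0, 1.0])
w = pla(X, y)
print(np.sign(X @ w))
```

Because the data are linearly separable, PLA is guaranteed to terminate with a weight vector that classifies every point correctly (LFD ch. 1 proves this).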
Recommended Weekly Readings
Each week, we will post readings and other media that complement the lecture material. [TAGS] are acronyms for textbooks listed under Resources below. Most of this content is available online at no cost. These readings are strongly recommended, but not required. Feel free to skip familiar topics; otherwise dive in!
- Week 1: LFD (ch. 1 on PLA), VMLS (ch. 3 on norms and inner products), PDSH (ch. 2 on Numpy; ch. 4 on Matplotlib), Getting started with conda, Jupyter Notebook: An Introduction, and Deep Learning with PyTorch: A 60 Minute Blitz
- Week 2: LFD (ch. 1.3 on feasibility of learning, 1.4 on error & noise, 3.2 on linear regression, 3.3 on gradient descent, maximum likelihood estimation, logistic regression), DLB (ch. 4 on grad descent, ch. 5 on MLE), VMLS (ch. 12 on least squares), PDSH (ch. 5 in depth on linear regression)
- Week 3: LFD (ch. 2.1 on the theory of generalization, e-chapter 7.1-7.3 on forward prop/backprop), video: "What is backpropagation really doing?", Deep Learning with PyTorch: A 60 Minute Blitz (first 3 sections)
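The Week 2 readings on gradient descent and least squares (LFD 3.2-3.3, VMLS ch. 12) can be tied together in a few lines of NumPy. This is a sketch under illustrative assumptions: the learning rate, step count, and noise-free toy data are my choices, not from the readings.

```python
import numpy as np

def gd_least_squares(X, y, lr=0.1, steps=2000):
    """Minimize the mean squared error (1/n)||Xw - y||^2 by batch gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of the MSE
        w -= lr * grad
    return w

# Data generated from y = 1 + 2*x with no noise, so GD should recover [1, 2]
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(x), x])  # bias column + feature
y = 1.0 + 2.0 * x
w = gd_least_squares(X, y)
print(np.round(w, 3))
```

On a small problem like this the closed-form least-squares solution (`np.linalg.lstsq`) is preferable; gradient descent is shown because it is the piece that scales up to the neural networks later in the course.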
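The Week 3 backprop reading and the "What is backpropagation really doing?" video can be grounded with a tiny hand-derived example: one hidden layer, tanh activation, squared error on a single sample, with the analytic gradient checked against a numerical one. The shapes and variable names are illustrative, not taken from the e-chapter.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # input
t = 1.0                       # target
W1 = rng.normal(size=(4, 3))  # hidden-layer weights
w2 = rng.normal(size=4)       # output weights

# Forward pass
s = W1 @ x                    # pre-activation
h = np.tanh(s)                # hidden activation
yhat = w2 @ h                 # linear output
loss = 0.5 * (yhat - t) ** 2

# Backward pass: apply the chain rule layer by layer
dyhat = yhat - t                  # dL/dyhat
dw2 = dyhat * h                   # dL/dw2
dh = dyhat * w2                   # dL/dh
ds = dh * (1 - np.tanh(s) ** 2)   # through tanh' = 1 - tanh^2
dW1 = np.outer(ds, x)             # dL/dW1

# Sanity check: compare one entry against a finite-difference gradient
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
lp = 0.5 * (w2 @ np.tanh(W1p @ x) - t) ** 2
print(abs((lp - loss) / eps - dW1[0, 0]) < 1e-4)
```

This gradient check is exactly what `torch.autograd` automates, which is why the PyTorch blitz is paired with this week's reading.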
- Textbooks that are not required, but may prove quite helpful:
- Online resources (tutorials, videos, etc.) of interest: