MIT Introduction to Deep Learning 6.S191: Lecture 1
Foundations of Deep Learning
Lecturer: Alexander Amini

For all lectures, slides, and lab materials: http://introtodeeplearning.com/

Lecture Outline
0:00 – Introduction
4:48 – Course information
10:18 – Why deep learning?
12:28 – The perceptron
14:42 – Activation functions
17:48 – Perceptron example
21:43 – From perceptrons to neural networks
27:42 – Applying neural networks
30:21 – Loss functions
33:23 – Training and gradient descent
38:05 – Backpropagation
43:06 – Setting the learning rate
47:17 – Batched gradient descent
49:49 – Regularization: dropout and early stopping
55:55 – Summary
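
Several of the topics above (the perceptron, activation functions, loss functions, and training with gradient descent) can be captured in a few lines of code. The sketch below is an illustrative NumPy example, not taken from the course labs: it trains a single sigmoid perceptron on the OR function using hand-derived gradients, the same forward-pass / loss / gradient-update loop the lecture builds up to.

```python
import numpy as np

# Toy data: learn the OR function with a single perceptron.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias
lr = 0.5                 # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(1000):
    # Forward pass: weighted sum plus bias, passed through a nonlinearity.
    y_hat = sigmoid(X @ w + b)
    # Binary cross-entropy loss, averaged over the four examples.
    loss = -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
    # Gradient of the loss w.r.t. the pre-activation (sigmoid + cross-entropy).
    grad_z = (y_hat - y) / len(y)
    # Gradient descent update on weights and bias.
    w -= lr * (X.T @ grad_z)
    b -= lr * grad_z.sum()

print("loss:", round(float(loss), 4), "predictions:", np.round(y_hat, 2))
```

The course labs themselves use TensorFlow; this pure-NumPy version is only meant to make the update rule explicit before the framework handles it automatically.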

Subscribe to stay up to date with new deep learning lectures at MIT, or follow us on @MITDeepLearning on Twitter and Instagram to stay fully-connected!!
