MIT Introduction to Deep Learning 6.S191: Lecture 1
Foundations of Deep Learning
Lecturer: Alexander Amini

For all lectures, slides, and lab materials: http://introtodeeplearning.com/

Lecture Outline
0:00 – Introduction
6:35 – Course information
9:51 – Why deep learning?
12:30 – The perceptron
14:31 – Activation functions
17:03 – Perceptron example
20:25 – From perceptrons to neural networks
26:37 – Applying neural networks
29:18 – Loss functions
31:19 – Training and gradient descent
35:46 – Backpropagation
38:55 – Setting the learning rate
41:37 – Batched gradient descent
43:45 – Regularization: dropout and early stopping
47:58 – Summary

Subscribe to stay up to date with new deep learning lectures at MIT, or follow @MITDeepLearning on Twitter and Instagram to stay fully-connected!
