In this video, we get to uncover the fundamental building block of BERT’s ability to understand language! It’s a mechanism called “Self-Attention”.

We’ll again be relying heavily on the excellent illustrations in Jay Alammar’s post, “The Illustrated Transformer”: http://jalammar.github.io/illustrated-transformer/.
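For a quick feel of what the video covers, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy. The weight matrices and dimensions are toy values for illustration, not BERT's actual parameters:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    # Project each token embedding into query, key, and value vectors.
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = K.shape[-1]
    # Score every token against every other token, scale by sqrt(d_k),
    # softmax into attention weights, then blend the value vectors.
    scores = Q @ K.T / np.sqrt(d_k)
    return softmax(scores) @ V

# Toy example: 3 tokens with 4-dim embeddings, 2-dim Q/K/V projections.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 2)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # one 2-dim output vector per input token
```

Each output row is a weighted mix of all the value vectors, which is how every token gets to "look at" every other token in the sequence.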

==== Full Series ====
The BERT Research Series is complete! All 8 episodes are up:
https://www.youtube.com/playlist?list=PLam9sigHPGwOBuH4_4fr-XvDbe5uneaf6

==== Updates ====
Sign up to hear about new content across my blog and channel: https://www.chrismccormick.ai/subscribe
