==== Overview ====
We’re going to continue the BERT Research series by digging into the architecture details and “inner workings” of BERT. We’ll be following Jay Alammar’s excellent post, “The Illustrated Transformer”: http://jalammar.github.io/illustrated-transformer/
==== References ====
Here is the video I reference, which gives a simple explanation of Neural Machine Translation (prior to the Transformer):
https://www.youtube.com/watch?v=AIpXjFwVdIE
==== Updates ====
Sign up to hear about new content across my blog and channel: https://www.chrismccormick.ai/subscribe