In the final video of the series, we’ll look in detail at the two BERT pre-training tasks, the “Masked Language Model” and “Next Sentence Prediction”, which are what really set BERT apart from prior models.
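
To make the two tasks concrete, here’s a minimal Python sketch of how pre-training examples can be constructed. Everything here is illustrative rather than BERT’s actual code: the toy vocabulary, sentences, and helper names (mask_tokens, make_nsp_pair) are made up, while the 15% masking rate, the 80/10/10 replacement split, and the 50/50 IsNext/NotNext split follow the numbers given in the BERT paper.

import random

rng = random.Random()

# --- Masked Language Model (MLM) ---
# Toy vocabulary; real BERT uses its ~30,000-token WordPiece vocabulary.
VOCAB = ["the", "cat", "sat", "on", "mat", "a", "dog", "ran"]
MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15):
    """BERT-style corruption: select ~15% of positions to predict; of
    those, 80% become [MASK], 10% become a random token, and 10% are
    left unchanged. (Real BERT never masks [CLS]/[SEP]; that detail is
    omitted here for brevity.)"""
    corrupted, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok                       # model must recover this
            roll = rng.random()
            if roll < 0.8:
                corrupted[i] = MASK               # 80%: replace with [MASK]
            elif roll < 0.9:
                corrupted[i] = rng.choice(VOCAB)  # 10%: random token
            # else 10%: keep the original token in place
    return corrupted, labels

# --- Next Sentence Prediction (NSP) ---
def make_nsp_pair(doc_sentences, all_sentences):
    """Build one NSP example: 50% of the time sentence B actually follows
    sentence A ("IsNext"); 50% of the time B is a random sentence pulled
    from elsewhere in the corpus ("NotNext")."""
    i = rng.randrange(len(doc_sentences) - 1)
    sent_a = doc_sentences[i]
    if rng.random() < 0.5:
        return sent_a, doc_sentences[i + 1], "IsNext"
    return sent_a, rng.choice(all_sentences), "NotNext"

corrupted, labels = mask_tokens("the cat sat on the mat".split())
print(corrupted, labels)  # output varies from run to run

doc = ["He went to the store.", "He bought a gallon of milk."]
corpus = doc + ["Penguins are flightless birds."]
print(make_nsp_pair(doc, corpus))

(In the real pipeline the two tasks are trained jointly: each input is a “[CLS] sentence A [SEP] sentence B [SEP]” pair built as above, with the MLM corruption applied on top.)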

We’ll finish by looking at how the OpenAI GPT model compares to BERT. The two models are surprisingly similar!
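
A quick way to see both the similarity and the key difference in code: architecturally, both models are stacks of Transformer blocks, and the main distinction is the attention mask each one applies. The NumPy sketch below is illustrative only (it is neither model’s real implementation): GPT is trained left-to-right, so position i can attend only to positions up to i, while BERT’s masked-LM objective lets every token attend in both directions.

import numpy as np

seq_len = 5

# GPT-style causal mask: lower-triangular, so token i attends only to
# tokens 0..i (strictly left-to-right context).
gpt_mask = np.tril(np.ones((seq_len, seq_len)))

# BERT-style bidirectional "mask": all ones, so every token attends to
# every other token (full left and right context).
bert_mask = np.ones((seq_len, seq_len))

# In an actual attention layer, the zero entries would be converted to
# large negative scores before the softmax, zeroing out those weights.
print(gpt_mask)
print(bert_mask)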

==== Updates ====
Sign up to hear about new content across my blog and channel: https://www.chrismccormick.ai/subscribe

==== References ====
Kaggle Interview with Jacob Devlin (BERT Author): https://youtu.be/u91645MFytY (sorry for the awkward screenshot… I didn’t catch how goofy it looked while editing!)

Yannic Kilcher’s walkthrough of the BERT paper was very helpful for me in understanding the relation between BERT and the OpenAI GPT: https://youtu.be/-9evrZnBorM
