In part 3 of Generative AI Foundations on AWS, AWS generative AI expert instructor Emily Webber explains the benefits of using a pre-trained foundation model and how to fine-tune it for your use case.

You’ll learn about different types of prompt engineering — zero-shot, single-shot, and few-shot — and how to apply them to NLP use cases like summarization, classification, and translation. Then you’ll move on to fine-tuning a pre-trained model, covering classic fine-tuning, parameter-efficient fine-tuning, and finally how to access Hugging Face’s new library. Go hands-on by fine-tuning GPT-J 6B with SageMaker JumpStart on SEC filing data using this GitHub resource: https://go.aws/3DjMjFq
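To see how the prompting styles mentioned above differ, here is a minimal sketch of assembling zero-shot, single-shot, and few-shot prompts for a sentiment-classification task. The `build_prompt` helper and the example reviews are illustrative, not taken from the lesson:

```python
def build_prompt(task, examples, query):
    """Assemble a prompt: zero-shot (no examples), single-shot (one
    example), or few-shot (several examples), followed by the query."""
    lines = [task]
    for text, label in examples:  # empty list -> zero-shot
        lines.append(f"Review: {text}\nSentiment: {label}")
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

task = "Classify the sentiment of each review as Positive or Negative."

# Zero-shot: the model gets only the task description and the query.
zero_shot = build_prompt(task, [], "The battery dies within an hour.")

# Few-shot: a handful of labeled examples precede the query.
few_shot = build_prompt(
    task,
    [("Great screen and fast shipping.", "Positive"),
     ("Arrived broken and support never replied.", "Negative")],
    "The battery dies within an hour.",
)

print(zero_shot)
print("---")
print(few_shot)
```

The same pattern scales from zero examples up to however many fit in the model’s context window; the lesson applies it to summarization and translation as well.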

Learn more about generative AI on AWS: https://go.aws/44rbDVG

Tune in to Build On Generative AI with host Emily Webber on twitch.tv/aws for even more tips and tricks: https://m.twitch.tv/videos/1723458659

Access the slides from this lesson to follow along: https://github.com/aws-samples/sagemaker-distributed-training-workshop/blob/main/slides/Generative%20AI%20Foundations%20Technical%20Deep%20Dive/1%20-%20Intro%20to%20FMs.pdf.zip

Subscribe:
More AWS videos: https://go.aws/3m5yEMW
More AWS events videos: https://go.aws/3ZHq4BK

Do you have technical AWS questions?
Ask the community of experts on AWS re:Post: https://go.aws/3lPaoPb

ABOUT AWS
Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers — including the fastest-growing startups, largest enterprises, and leading government agencies — are using AWS to lower costs, become more agile, and innovate faster.

#AWSMachineLearning #MachineLearningUniversity #MLU #LearnML #LearnMachineLearning #aiml #GenerativeAI #GPT #AWS #AmazonWebServices #CloudComputing
