
IBM

Mastering Generative AI: Fine-Tuning Transformers

IBM via edX

Overview

The demand for technical gen AI skills is exploding, and AI engineers who know how to fine-tune transformers for gen AI applications are in high demand. This Generative AI Engineering and Fine-Tuning Transformers course is designed for AI engineers and other AI specialists who want to add these highly sought-after skills to their resumes.

In this course, you’ll explore the differences between PyTorch and Hugging Face, use pre-trained transformers for language tasks, and fine-tune them for specialized tasks. Plus, you’ll fine-tune generative AI models using PyTorch and Hugging Face.
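To give a flavor of the fine-tuning pattern the course teaches, here is a minimal sketch in plain PyTorch: freeze a "pre-trained" backbone and train only a new task-specific head. The tiny model and random data are illustrative stand-ins, not course materials; in practice the backbone would be a transformer loaded from Hugging Face.

```python
# Minimal fine-tuning sketch: freeze a pre-trained backbone,
# train only a newly attached classification head.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a pre-trained encoder (in practice, a transformer).
backbone = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 32))
head = nn.Linear(32, 2)  # new task-specific classification head

# Freeze the backbone so only the head is updated during training.
for p in backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 16)         # toy inputs
y = torch.randint(0, 2, (64,))  # toy labels

for _ in range(5):              # short fine-tuning loop
    optimizer.zero_grad()
    logits = head(backbone(x))
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()

trainable = sum(p.numel() for p in head.parameters())
print(f"trainable parameters: {trainable}")  # only the head's 66 params
```

The same freeze-and-train idea carries over when the backbone is a real pre-trained transformer and the loop is replaced by a Hugging Face training utility.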

You’ll also explore concepts like parameter-efficient fine-tuning (PEFT), low-rank adaptation (LoRA), quantized low-rank adaptation (QLoRA), model quantization, and prompting for natural language processing (NLP). Plus, through valuable hands-on labs, you’ll build experience loading models and running inference, training models with Hugging Face, pre-training LLMs, fine-tuning models, and building adapters with PyTorch.
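The core idea behind LoRA, one of the PEFT techniques listed above, can be sketched from scratch in a few lines of PyTorch: keep the pre-trained weight frozen and learn only a low-rank update B @ A, so just rank × (in + out) parameters are trained. The class name, rank, and layer sizes below are illustrative choices, not the course's code.

```python
# From-scratch sketch of the LoRA idea: frozen base weight plus a
# trainable low-rank update, scaled by alpha / rank.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pre-trained weights
            p.requires_grad = False
        in_f, out_f = base.in_features, base.out_features
        # Low-rank factors: A starts small, B starts at zero, so the
        # adapted layer is initially identical to the base layer.
        self.A = nn.Parameter(torch.randn(rank, in_f) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_f, rank))
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.A.t() @ self.B.t()) * self.scale

torch.manual_seed(0)
layer = LoRALinear(nn.Linear(512, 512), rank=4)

full = sum(p.numel() for p in layer.base.parameters())
lora = sum(p.numel() for p in (layer.A, layer.B))
print(f"frozen: {full}, trainable LoRA: {lora}")  # 262,656 vs 4,096
```

In practice the course's labs use Hugging Face tooling for this rather than hand-rolled modules, but the parameter-count savings shown here are the whole point of the technique; QLoRA pushes further by also quantizing the frozen base weights.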

If you’re looking to gain the job-ready transformer fine-tuning skills employers need, ENROLL TODAY and power up your resume for career success!

Prerequisites: This course requires basic knowledge of Python, PyTorch, and transformer architecture. You should also be familiar with machine learning and neural network concepts.

Syllabus

Module 0: Welcome

  • Video: Course Introduction
  • Reading: Professional Certificate Overview
  • Reading: General Information
  • Reading: Learning Objectives and Syllabus
  • Reading: Grading Scheme

Module 1: Transformers and Fine-Tuning

  • Reading: Module Introduction and Learning Objectives
  • Video: Hugging Face vs. PyTorch
  • Lab: Loading Models and Inference with Hugging Face
  • Video: Using Pre-Trained Transformers and Fine-Tuning
  • [Optional] Lab: Pre-training LLMs with Hugging Face
  • Video: Fine-Tuning with PyTorch
  • Video: Fine-Tuning with Hugging Face
  • Lab: Pre-Training and Fine-Tuning with PyTorch
  • Lab: Fine-Tuning Transformers with PyTorch and Hugging Face
  • Reading: Summary and Highlights: Transformers and Fine-Tuning
  • Practice Quiz: Transformers and Fine-Tuning
  • Graded Quiz: Transformers and Fine-Tuning

Module 2: Parameter Efficient Fine-Tuning (PEFT)

  • Reading: Module Introduction and Learning Objectives
  • Video: Introduction to PEFT
  • Lab: Adapters with PyTorch
  • Video: LoRA
  • Video: LoRA with Hugging Face and PyTorch
  • Lab: LoRA with PyTorch
  • Video: From Quantization to QLoRA
  • [Optional] Lab: QLoRA with Hugging Face
  • Reading: Soft Prompts
  • Reading: Summary and Highlights: Parameter Efficient Fine-Tuning (PEFT)
  • Practice Quiz: Parameter Efficient Fine-Tuning (PEFT)
  • Graded Quiz: Parameter Efficient Fine-Tuning (PEFT)

Module 3: Course Cheat Sheet, Glossary and Wrap-up

  • Reading: Cheat Sheet: Generative AI Engineering and Fine-Tuning Transformers
  • Reading: Course Glossary: Generative AI Engineering and Fine-Tuning Transformers

Course Wrap-Up

  • Course Conclusion
  • Reading: Congratulations and Next Steps
  • Reading: Team and Acknowledgements
  • Reading: Copyrights and Trademarks
  • Course Rating and Feedback
  • Reading: Frequently Asked Questions
  • Reading: Claim your badge here

Taught by

Joseph Santarcangelo

