Master Generative AI with hands-on training in Large Language Models (LLMs), PEFT techniques (LoRA, QLoRA), and diffusion models, using the Hugging Face libraries diffusers, peft, trl, and bitsandbytes. This course takes you from the internals of decoder-only transformers to building a specialist fine-tuned LLM and generating high-quality, controllable images with ControlNet.
In Module 1, explore decoder-only transformer architectures, self-attention, causal masking, KV caching, and token flow mechanics. Module 2 focuses on Parameter-Efficient Fine-Tuning (PEFT), where you'll implement LoRA, QLoRA, and 4-bit quantization to fine-tune large models on consumer GPUs using SFT pipelines. Module 3 dives into diffusion models, covering forward/reverse processes, UNet, schedulers (DDIM, Euler, DPM++), and ControlNet conditioning. Module 4 is a capstone where you'll build a Specialist LLM — from dataset creation to adapter export and evaluation.
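To make the Module 1 ideas concrete, here is a minimal, dependency-free sketch (not course material) of single-query attention with a KV cache. During autoregressive decoding, each step appends its key/value to the cache instead of recomputing attention over all past tokens; causal masking falls out for free because the cache only ever contains past and current positions. All function names here are illustrative, not from any library.

```python
import math

def attend(q, keys, values):
    """Scaled dot-product attention for one query over cached keys/values."""
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]  # softmax over past positions
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(d)]

def generate_step(q, k, v, kv_cache):
    """One decoding step: cache this token's key/value, then attend.
    Causal masking is implicit: the cache never holds future positions."""
    kv_cache["k"].append(k)
    kv_cache["v"].append(v)
    return attend(q, kv_cache["k"], kv_cache["v"])

# Two decoding steps with a shared cache (toy 2-dimensional heads).
cache = {"k": [], "v": []}
out1 = generate_step([1.0, 0.0], [1.0, 0.0], [0.5, 0.5], cache)
out2 = generate_step([0.0, 1.0], [0.0, 1.0], [1.0, 0.0], cache)
```

The payoff of the cache is that step *t* does O(t) work instead of re-running attention over the whole prefix, which is why production decoders keep per-layer KV caches in GPU memory.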
By the end of this course, you will:
- Build and optimize decoder-only transformer pipelines with KV caching
- Fine-tune 7B+ LLMs using LoRA, QLoRA, and SFT pipelines on limited hardware
- Configure diffusers pipelines with ControlNet for controllable image generation
- Train, export, and evaluate a domain-specialized LLM adapter end-to-end
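The LoRA fine-tuning named in the outcomes above rests on one idea: freeze the base weight matrix W and learn a low-rank update ΔW = (α/r)·A·B, so only the small A and B matrices train. A minimal pure-Python sketch of the forward pass (toy shapes and values are illustrative, not from the peft library):

```python
def matmul(X, Y):
    """Plain nested-list matrix multiply."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)] for row in X]

def lora_forward(x, W, A, B, alpha, r):
    """y = x @ W + (alpha / r) * (x @ A @ B)
    W is the frozen base weight; A (d_in x r) and B (r x d_out) are the
    trainable low-rank adapter, scaled by alpha / r."""
    base = matmul(x, W)
    adapter = matmul(matmul(x, A), B)
    scale = alpha / r
    return [[b + scale * a for b, a in zip(brow, arow)]
            for brow, arow in zip(base, adapter)]

# Toy example: d_in = d_out = 2, rank r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weights (identity here)
A = [[1.0], [1.0]]             # trainable down-projection, 2 x 1
B = [[0.5, -0.5]]              # trainable up-projection, 1 x 2
x = [[2.0, 4.0]]
y = lora_forward(x, W, A, B, alpha=2.0, r=1)
```

Because only A and B receive gradients, a rank-8 adapter on a 7B model trains a tiny fraction of the parameters; QLoRA goes further by also quantizing the frozen W to 4 bits, which is what makes consumer-GPU fine-tuning feasible.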
Disclaimer: This is an independent educational resource created by Board Infinity for informational and educational purposes only. This course is not affiliated with, endorsed by, sponsored by, or officially associated with any company, organization, or certification body unless explicitly stated. The content provided is based on industry knowledge and best practices but does not constitute official training material for any specific employer or certification program. All company names, trademarks, service marks, and logos referenced are the property of their respective owners and are used solely for educational identification and comparison purposes.