
Microsoft

Working with large language models using Azure

Microsoft via Coursera

Overview

Master the development of generative AI solutions using Microsoft Azure. This hands-on course guides you through the complete application lifecycle, from foundational concepts to deployment. You will learn to control Large Language Models (LLMs) with advanced prompt engineering, ground models in custom data using Retrieval-Augmented Generation (RAG) pipelines, and tailor their behavior with fine-tuning techniques. Using powerful Azure tools, you'll build, deploy, and manage sophisticated AI applications ready to solve real-world challenges.

Syllabus

  • Understanding Large Language Models (LLMs)
    • This foundational module introduces the core concepts behind Large Language Models (LLMs). You will start by exploring the fundamental architecture that powers models like GPT (Generative Pre-trained Transformer) and learn how they process information and generate human-like text. The second half of the module is dedicated to prompt engineering, where you will learn and apply essential techniques—from basic commands to advanced strategies like few-shot learning and chain-of-thought—to effectively communicate with and control AI models to achieve desired outcomes. Important Notice on the Azure Interface: The screencast videos and screenshots were last updated in late 2025. Please be aware that Microsoft may have updated the Azure interface since then. If the steps shown in the course materials look different from your current Azure environment, please follow the most up-to-date interface, as the underlying concepts and learning objectives remain the same.
  • Implementing RAG pipelines
    • This module focuses on one of the most powerful techniques for enhancing LLMs: Retrieval-Augmented Generation (RAG). You will learn how to ground models in external, private, or real-time data sources to provide more accurate and contextually relevant responses. You will start by building a basic RAG pipeline using Azure services and then progress to constructing and optimizing advanced systems with techniques like semantic ranking and sophisticated data chunking strategies.
  • Fine-tuning and customizing LLMs
    • This module explores fine-tuning as a powerful method for customizing an LLM's core behavior, style, or knowledge for specialized tasks. You will learn the entire fine-tuning workflow, from preparing a high-quality dataset to launching the training job and evaluating the customized model's performance in Azure. Critically, you will learn to strategically decide when to use fine-tuning versus RAG—or a hybrid of both—to create highly effective, domain-specific AI solutions.
  • Developing generative applications with Azure
    • This module transitions from theory to practice by guiding you through the end-to-end process of building and deploying a complete generative AI application. You will learn to design an application's architecture and user flow before using Azure AI Foundry and Prompt flow tools to build it. The module then covers the critical MLOps lifecycle, teaching you how to deploy your application as a secure endpoint, manage it in a production environment, and implement monitoring with Azure Monitor for performance and cost.
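
The prompt engineering techniques named in the first module can be sketched as plain message construction. This is a minimal, hypothetical example of combining a system instruction, few-shot demonstrations, and a chain-of-thought cue into the chat message list an Azure OpenAI chat deployment expects; the task and demonstrations are invented for illustration and are not from the course.

```python
# Sketch: assembling a few-shot, chain-of-thought prompt as a chat message list.
# The role/content message shape matches the Azure OpenAI chat format; the
# arithmetic task below is an illustrative stand-in.

def build_messages(question: str) -> list[dict]:
    """Combine a system instruction, few-shot examples, and the user question."""
    system = {
        "role": "system",
        "content": "You are a careful assistant. Reason step by step before answering.",
    }
    # Few-shot demonstrations that also model chain-of-thought reasoning.
    few_shot = [
        {"role": "user", "content": "A pack has 12 pens. How many pens in 3 packs?"},
        {"role": "assistant", "content": "3 packs x 12 pens = 36 pens. Answer: 36."},
    ]
    return [system, *few_shot, {"role": "user", "content": question}]

messages = build_messages("A server handles 250 requests/min. How many per hour?")
print(len(messages))        # system + 2 demonstrations + 1 question
print(messages[-1]["role"])
```

The same list would be passed as the `messages` argument of a chat completion call against a deployed model; adding or removing demonstrations is how few-shot behavior is tuned.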
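
The RAG flow described in the second module — chunk documents, retrieve relevant chunks, ground the prompt in them — can be shown with a toy retriever. This sketch uses fixed-size word chunking with overlap and naive term-overlap scoring as a stand-in; a real Azure pipeline would use an embedding model and Azure AI Search with semantic ranking, but the shape of the flow is the same. The document text is invented for illustration.

```python
# Toy RAG sketch: fixed-size chunking with overlap plus a naive term-overlap
# retriever (a stand-in for vector search with semantic ranking).

def chunk(text: str, size: int = 8, overlap: int = 2) -> list[str]:
    """Split text into windows of `size` words, each sharing `overlap` words."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    """Rank chunks by how many query terms they contain."""
    q = set(query.lower().split())
    scored = sorted(chunks, key=lambda c: len(q & set(c.lower().split())), reverse=True)
    return scored[:k]

doc = ("Azure OpenAI deployments are created per region. "
       "Retrieval-Augmented Generation grounds the model in your own documents. "
       "Fine-tuning changes model weights instead.")
chunks = chunk(doc)
context = retrieve("How does RAG ground the model in documents?", chunks)
# Grounded prompt: the model is told to answer only from the retrieved context.
prompt = f"Answer using only this context:\n{context[0]}\n\nQuestion: ..."
print(context[0])
```

Chunk size and overlap are exactly the "data chunking strategy" knobs the module refers to: larger chunks preserve context, smaller ones sharpen retrieval.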
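
The dataset-preparation step of the fine-tuning workflow in the third module typically produces a JSONL file, one JSON object per line, each holding a `messages` conversation — the chat format Azure OpenAI fine-tuning jobs consume. The support-desk rows below are invented examples, not course data.

```python
# Sketch: preparing a supervised fine-tuning dataset in JSONL chat format.
# Each line is one training conversation; the rows here are illustrative.
import json

examples = [
    ("Reset my password", "Go to Settings > Security and choose 'Reset password'."),
    ("Export my data", "Open Settings > Privacy and select 'Export'."),
]

lines = []
for user_text, assistant_text in examples:
    lines.append(json.dumps({
        "messages": [
            {"role": "system", "content": "You are the product's support assistant."},
            {"role": "user", "content": user_text},
            {"role": "assistant", "content": assistant_text},
        ]
    }))

jsonl = "\n".join(lines)   # this string would be written to training.jsonl
first = json.loads(jsonl.splitlines()[0])
print(first["messages"][1]["content"])
```

Quality here matters more than quantity: every line teaches the model a target behavior, which is why the module pairs dataset preparation with evaluation of the resulting model.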
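
Once an application is deployed as a secure endpoint, as the final module describes, clients call it over HTTPS with a JSON body and an auth header. The sketch below only assembles such a request without sending it; the URL, key, and payload schema are placeholders, since the real values depend on your own deployment.

```python
# Hypothetical sketch of a scoring request to a deployed endpoint. The URL,
# key, and body schema are placeholders, not values from the course; managed
# endpoints authenticate with a key or Azure AD token in the header.
import json

def build_request(endpoint_url: str, api_key: str, question: str):
    """Assemble URL, headers, and body for a scoring call (not sent here)."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",  # placeholder auth scheme
    }
    body = json.dumps({"question": question}).encode("utf-8")
    return endpoint_url, headers, body

url, headers, body = build_request(
    "https://example-endpoint.example.com/score",  # placeholder URL
    "PLACEHOLDER_KEY",
    "What is our refund policy?",
)
print(url)
```

In production these calls are exactly what Azure Monitor would track for latency, error rate, and token cost.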

Taught by

Microsoft

