Overview
Learn how to build and scale AI tool servers that let Large Language Models (LLMs) interact with external tools using the Model Context Protocol (MCP). This 48-minute PyCon US talk explores how MCP provides a standardized format for LLMs to call external tools to gather information and take real-world actions.

Discover implementation techniques for MCP-compliant tool servers in Python, including transport types, connection lifecycle management, and best practices. Explore strategies for scaling Python services horizontally through load balancing and container orchestration, with a special focus on deploying to 12-Factor App platforms like Heroku.

Through live coding demonstrations, see how to implement, deploy, and scale an MCP-compliant tool server, and watch LLMs interact with these servers to create powerful API-driven AI agents. Gain practical knowledge for building scalable AI infrastructure with Python and Heroku's streamlined deployment process.
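To give a flavor of the standardized format the talk describes, here is a minimal, stdlib-only sketch of the JSON-RPC 2.0 request/response shape that MCP uses for tool discovery (`tools/list`) and invocation (`tools/call`). The `get_weather` tool, the `TOOLS` registry, and `handle_request` are all hypothetical names for illustration; a real server would typically be built with the official `mcp` Python SDK and a proper transport rather than a bare dispatcher like this.

```python
import json

# Hypothetical in-memory tool registry (illustrative only).
TOOLS = {
    "get_weather": {
        "description": "Return a canned weather report for a city.",
        "handler": lambda args: f"Sunny in {args['city']}",
    },
}

def handle_request(raw: str) -> str:
    """Dispatch a JSON-RPC 2.0 message for tools/list and tools/call."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        # Advertise available tools so the LLM client can discover them.
        result = {"tools": [{"name": name, "description": tool["description"]}
                            for name, tool in TOOLS.items()]}
    elif req["method"] == "tools/call":
        # Invoke the named tool and wrap its output as text content.
        tool = TOOLS[req["params"]["name"]]
        text = tool["handler"](req["params"].get("arguments", {}))
        result = {"content": [{"type": "text", "text": text}]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601, "message": "method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# Example call, as an LLM client might issue it:
call = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/call",
                   "params": {"name": "get_weather",
                              "arguments": {"city": "Pittsburgh"}}})
print(handle_request(call))
```

Because the handler is stateless, many copies of it can sit behind a load balancer, which is exactly the horizontal-scaling property the talk leans on for 12-Factor deployment.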
Syllabus
Building Scalable AI Tool Servers with Model Context Protocol (MCP) and Heroku (Sponsor: Heroku)
Taught by
PyCon US