

Building Scalable AI Tool Servers with Model Context Protocol (MCP) and Heroku

PyCon US via YouTube

Overview

Learn how to build and scale AI tool servers that empower Large Language Models (LLMs) to interact with external tools using the Model Context Protocol (MCP). This 48-minute PyCon US talk explores how MCP provides a standardized format for LLMs to call external tools for gathering information and taking real-world actions. Discover implementation techniques for MCP-compliant tool servers in Python, including transport types, connection lifecycle management, and best practices. Explore strategies for scaling Python services horizontally through load balancing and container orchestration, with special focus on deploying to 12-Factor App platforms like Heroku. Through live coding demonstrations, see how to implement, deploy, and scale an MCP-compliant tool server, and witness how LLMs can interact with these servers to create powerful API-driven AI agents. Gain practical knowledge for building scalable AI infrastructure with Python and Heroku's streamlined deployment process.
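To make the core idea concrete, here is a minimal, dependency-free sketch of MCP-style tool dispatch: an LLM client sends a JSON-RPC message naming a tool and its arguments, and the server routes it to a registered Python function. The function names and registry here are illustrative only; in practice the official `mcp` Python SDK handles registration, schemas, and transports for you.

```python
import json

# Registry mapping tool names to Python callables (illustrative, not the real SDK).
TOOLS = {}

def tool(fn):
    """Register a function as a callable tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def get_weather(city: str) -> str:
    # Stand-in for a real external API call.
    return f"Sunny in {city}"

def handle_request(raw: str) -> str:
    """Dispatch a JSON-RPC 2.0 'tools/call' message to the matching tool."""
    req = json.loads(raw)
    name = req["params"]["name"]
    args = req["params"].get("arguments", {})
    result = TOOLS[name](**args)
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# The kind of message an MCP client would send on the LLM's behalf.
request = json.dumps({
    "jsonrpc": "2.0", "id": 1,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Pittsburgh"}},
})
print(handle_request(request))
```

Because each request is self-contained and the server holds no per-client state, many replicas of a server like this can sit behind a load balancer, which is what makes the horizontal-scaling and 12-Factor deployment story in the talk work.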

Syllabus

Building Scalable AI Tool Servers with Model Context Protocol (MCP) and Heroku (Sponsor: Heroku)

Taught by

PyCon US

