LLMOps - Building an AI-Powered Search Engine with MCP Servers
The Machine Learning Engineer via YouTube
Overview
Learn to build an AI-powered search engine using MCP (Model Context Protocol) servers in this comprehensive tutorial. Discover how to create a distributed system architecture with three integrated components: a Streamlit client for the AI chat interface, a Google Search MCP server for web search and content extraction, and a Perplexity MCP server for AI-powered search analysis. Explore the implementation of a four-layer platform architecture including user authentication, AI orchestration with LangChain agents, MCP protocol communication, and external data integration. Master the deployment of Docker containers to emulate a real distributed environment, implement multi-provider AI support with Azure OpenAI, and integrate smart caching mechanisms for both Google Custom Search API and Perplexity API responses. Gain hands-on experience with server-sent event communication, tool selection algorithms, prompt engineering techniques, and security implementations across the entire system stack.
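The smart-caching idea mentioned above — reusing recent Google Custom Search or Perplexity API responses instead of re-fetching them — can be sketched with a small stdlib-only TTL cache. This is an illustrative sketch, not the tutorial's actual code: the class name `TTLCache`, the helper `cached_search`, and the parameter names are all assumptions made for the example.

```python
import hashlib
import json
import time

class TTLCache:
    """Tiny in-memory cache with per-entry expiry, keyed by the
    request parameters (illustrative stand-in for the tutorial's
    response-caching layer)."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def _key(self, params):
        # Derive a stable key from the JSON-serializable request params.
        raw = json.dumps(params, sort_keys=True).encode()
        return hashlib.sha256(raw).hexdigest()

    def get(self, params):
        entry = self._store.get(self._key(params))
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() > expires_at:
            # Entry expired; drop it and report a miss.
            del self._store[self._key(params)]
            return None
        return value

    def set(self, params, value):
        self._store[self._key(params)] = (time.monotonic() + self.ttl, value)

def cached_search(cache, params, fetch):
    """Return a fresh cached response if available; otherwise call
    fetch(params) (e.g. the search API) and cache the result."""
    hit = cache.get(params)
    if hit is not None:
        return hit
    result = fetch(params)
    cache.set(params, result)
    return result
```

With this shape, repeated identical queries within the TTL window hit the cache and the external API is called only once, which is the point of caching both search providers' responses.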
Syllabus
LLMOps: AI-powered Search Engine #machinelearning #datascience
Taught by
The Machine Learning Engineer