
Building LLMs with Hugging Face and LangChain

Edureka via Coursera Specialization

Overview

The Building LLMs with Hugging Face and LangChain Specialization teaches you how to create modern LLM applications, from core concepts to real-world deployment. You will learn how LLMs work, how to build applications with LangChain, and how to optimize and deploy systems using industry tools.

In Course 1, you’ll explore the foundations of LLMs, including tokenization, embeddings, transformer architecture, and attention. You’ll work with the Hugging Face Hub, Datasets, and Transformers pipelines, experiment with models like BERT, GPT, and T5, and build simple NLP workflows.

In Course 2, you’ll build real LLM applications using LangChain and LCEL. You’ll create prompts, chains, memory, and RAG pipelines with FAISS, process documents, and integrate agents, tools, APIs, LangServe, LangSmith, and LangGraph.

In Course 3, you’ll optimize and deploy LLM systems. You’ll improve latency and token usage, integrate structured and multimodal data, orchestrate workflows with LlamaIndex and LangGraph, build FastAPI services, add security, containerize with Docker, and deploy with monitoring and CI/CD.

By the end, you’ll be able to create and deploy production-ready LLM applications using modern tools and MLOps practices.
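To give a feel for the RAG pipelines covered in Course 2: at their core, they embed documents as vectors and retrieve the ones most similar to a query before passing them to an LLM. The toy sketch below illustrates only that retrieval step in plain Python; the course itself uses LangChain and FAISS with neural embeddings, and every name here is illustrative, not part of any library API.

```python
# Toy sketch of the retrieval step in a RAG pipeline.
# Real systems use a neural embedding model and a vector index such as FAISS;
# here a crude bag-of-words vector and cosine similarity stand in for both.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Bag-of-words 'embedding': token -> count (a stand-in for a neural encoder)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Transformers use attention to relate tokens in a sequence.",
    "FAISS indexes dense vectors for fast similarity search.",
    "Docker packages an application with its dependencies.",
]
print(retrieve("How does vector similarity search work?", docs, k=1))
```

In a production pipeline the retrieved documents would then be inserted into the prompt sent to the LLM, which is the step LangChain's chains and LCEL automate.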

Syllabus

  • Course 1: Introduction to LLMs and Hugging Face
  • Course 2: Developing LLM Applications with LangChain
  • Course 3: Optimizing and Deploying LLM Systems


Taught by

Edureka

