Guide to the LLM Ecosystem: Hugging Face, GPUs, OpenAI, LangChain and More - Lecture 2
Data Centric via YouTube
Overview
Dive into the complex world of Large Language Models (LLMs) with this comprehensive lecture from the AI Engineering Take-off Course. Gain clarity on key concepts in the LLM ecosystem, including Hugging Face, GPU infrastructure, OpenAI, and LangChain. Explore the fundamentals of how LLMs work, understand the role of different components in the ecosystem, and learn essential knowledge for developing LLM applications. Follow along with detailed chapters covering topics such as infrastructure and hardware, proprietary LLMs, inference servers, app development frameworks, and frontend considerations. Complement your learning with additional resources, including a related blog post and links to other helpful content on AI, Data Science, and LLM development.
Syllabus
Intro
The Ecosystem
All about LLMs
Infrastructure & Hardware
Hugging Face
Proprietary LLMs (OpenAI)
Inference Server
App Dev Frameworks
Frontend
Taught by
Data Centric
Reviews
2.0 rating, based on 1 Class Central review
Great, but not that good. This is common knowledge for anyone in tech, so there was nothing really new for me to learn; I have known these things for the last 3 to 4 years.