Overview
Explore the development and capabilities of GPT-NeoX-20B, a 20-billion-parameter open-source language model, in this interview with EleutherAI co-founder Connor Leahy. Discover the training process, hardware acquisition, and model performance. Learn about the differences between GPT-Neo, GPT-J, and GPT-NeoX, and gain insight into the challenges of training large language models. Find out how to try the model yourself using GooseAI, and hear final thoughts on the project's impact and future potential.
Syllabus
- Intro
- Start of interview
- How did you get all the hardware?
- What's the scale of this model?
- A look into the experimental results
- Why are there GPT-Neo, GPT-J, and GPT-NeoX?
- How difficult is training these big models?
- Try out the model on GooseAI
- Final thoughts
Taught by
Yannic Kilcher