
Inside GPT - The Maths Behind the Magic

GOTO Conferences via YouTube

Overview

Explore the mathematical foundations and inner workings of GPT algorithms in this 41-minute conference talk from GOTO Copenhagen 2024. Dive deep into the architecture of generative pre-trained transformers and discover how ChatGPT actually functions under the hood. Begin with fundamental natural language processing concepts, including word embedding, vectorization, and tokenization, then watch a practical demonstration of training a GPT-2 model to generate song lyrics while examining how word sequences are predicted.

Learn about larger language models such as ChatGPT and GPT-4, including their capabilities, limitations, and the role of hyperparameters such as temperature and frequency penalty in output generation. Cover key concepts including model API architecture, sequence prediction, variable-length processing, long-term dependencies, prompt engineering, and Retrieval Augmented Generation (RAG). Examine practical aspects of text processing: language tokenization, word/token vectorization and embedding, similarity measures (Euclidean vs. cosine), transformer architecture, and attention mechanisms.

Gain insight into how English is becoming the new programming language and understand the challenges posed by heteronyms in natural language processing. Perfect for developers, AI practitioners, and anyone seeking to understand the mathematical principles behind modern language models and how to apply GPT algorithms in their own applications.
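As a taste of the tokenization segment, here is a minimal sketch using the Hugging Face transformers library and its pretrained "gpt2" tokenizer; this tooling is an illustrative assumption, not necessarily what the speaker uses on stage:

    # Assumes: pip install transformers (Hugging Face; illustrative choice only)
    from transformers import GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

    ids = tokenizer.encode("Inside GPT - the maths behind the magic")
    print(ids)                                   # a list of integer token ids
    print(tokenizer.convert_ids_to_tokens(ids))  # subword pieces, not whole words

GPT models operate on these subword tokens rather than on whole words, which is why tokenization comes before vectorization and embedding in the talk's progression.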
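The similarity-measures segment contrasts Euclidean distance with cosine similarity for comparing embedding vectors. A self-contained sketch of both measures, using made-up three-dimensional vectors (real GPT embeddings have hundreds or thousands of dimensions):

    import numpy as np

    # Made-up 3-dimensional "embeddings"; the values are illustrative only.
    king = np.array([0.8, 0.6, 0.1])
    queen = np.array([0.7, 0.7, 0.1])

    # Euclidean distance: straight-line distance between the two points.
    euclidean = np.linalg.norm(king - queen)

    # Cosine similarity: cosine of the angle between the vectors.
    cosine = king @ queen / (np.linalg.norm(king) * np.linalg.norm(queen))

    print(f"Euclidean distance: {euclidean:.3f}")  # smaller means more similar
    print(f"Cosine similarity:  {cosine:.3f}")     # closer to 1 means more similar

Cosine similarity depends only on the angle between the vectors, not their length, which is one reason it is widely preferred for comparing embeddings.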
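Temperature, one of the hyperparameters the talk covers, is commonly implemented by dividing the model's output logits by the temperature before applying softmax; the sketch below uses hypothetical logits for a four-token vocabulary:

    import numpy as np

    def sample_next_token(logits, temperature=1.0, rng=None):
        """Sample a token index from logits scaled by temperature."""
        rng = rng or np.random.default_rng()
        # Low temperature sharpens the distribution (more predictable output);
        # high temperature flattens it (more varied, more surprising output).
        scaled = np.asarray(logits, dtype=float) / temperature
        probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    logits = [2.0, 1.0, 0.5, -1.0]  # hypothetical logits, 4-token vocabulary
    print(sample_next_token(logits, temperature=0.2))  # almost always token 0
    print(sample_next_token(logits, temperature=1.5))  # noticeably more random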

Syllabus

00:00 Intro
06:00 Model API architecture
07:37 GPT sequence prediction
08:49 Variable length
11:13 Sequence order
11:28 Long-term dependencies
12:06 Prompt engineering
13:11 Retrieval Augmented Generation (RAG)
15:14 Demo: GPT-2
20:25 Processing text
24:22 English is the new programming language
29:14 Heteronyms
34:14 Demo: Language tokenization
34:46 Word/token vectorization/embedding
35:52 Euclidean vs Cosine similarity
37:32 Transformer architecture
40:23 Attention example
41:03 Outro

Taught by

GOTO Conferences

