
freeCodeCamp

Become an AI Researcher - LLM, Math, PyTorch, Neural Networks, Transformers

via freeCodeCamp

Overview

Learn the fundamentals of AI research through a comprehensive course that covers essential mathematics, PyTorch programming, neural networks, and transformer architectures. Begin with the foundational mathematical concepts that underpin modern artificial intelligence: functions, derivatives, vectors, gradients, matrices, and probability theory.

Master PyTorch fundamentals by working with tensors: reshaping and viewing operations, dimension manipulation, indexing and slicing, and special tensor constructors. Build neural networks from scratch, starting with a single neuron and progressing through activation functions such as sigmoid, ReLU, and tanh before tackling multi-layer networks and backpropagation.

Dive deep into the transformer architecture that powers large language models: attention mechanisms with query, key, and value components, self-attention and causal self-attention, rotary positional embeddings (RoPE), multi-head attention, and complete transformer blocks with feed-forward networks and normalization layers. Conclude with the tokenization techniques used in GPT architectures, giving you a complete foundation for understanding and implementing modern AI systems.
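As a taste of the PyTorch portion, the single-neuron idea described above (a weighted sum plus bias, passed through an activation) can be sketched in a few lines. The input, weight, and bias values here are arbitrary illustrations, not taken from the course:

```python
import torch

# Illustrative single neuron: y = sigmoid(w · x + b)
x = torch.tensor([1.0, 2.0, 3.0])    # input vector
w = torch.tensor([0.5, -0.2, 0.1])   # weights (arbitrary demo values)
b = torch.tensor(0.4)                # bias

z = torch.dot(w, x) + b              # weighted sum: 0.5 - 0.4 + 0.3 + 0.4 = 0.8
y = torch.sigmoid(z)                 # sigmoid squashes the sum into (0, 1)
print(z.item(), y.item())
```

Swapping `torch.sigmoid` for `torch.relu` or `torch.tanh` gives the other activations covered in the course.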

Syllabus

- 00:00:00 Welcome & Course Overview
- 00:05:28 Requirements & Setup for the Course
- 00:10:48 Math Lesson: Functions (Linear, Quadratic, Cubic, Square Root)
- 00:19:10 Math Lesson: Derivatives (Rate of Change)
- 00:33:19 Math Lesson: Vectors (Magnitude, Dot Product, Normalization)
- 00:46:07 Math Lesson: Gradients (Steepest Ascent/Descent, Partial Derivatives)
- 00:55:03 Math Lesson: Matrices (Multiplication, Transpose, Identity)
- 01:08:39 Math Lesson: Probability (Expected Value, Conditional Probability)
- 01:19:19 START: PyTorch Fundamentals & Creating Tensors
- 01:26:03 PyTorch Lesson: Reshaping and Viewing Tensors
- 01:27:48 PyTorch Lesson: Squeezing and Unsqueezing Dimensions
- 01:41:02 PyTorch Lesson: Indexing and Slicing Tensors
- 01:49:55 PyTorch Lesson: Special Tensors (Zeros, Ones, Linspace)
- 01:54:00 START: Coding Neural Networks from Scratch
- 01:54:29 Neural Networks Lesson: Single Neuron (Weights, Bias, Weighted Sum)
- 01:57:11 Neural Networks Lesson: Activation Functions (Sigmoid, ReLU, tanh)
- 02:03:07 Neural Networks Lesson: Multi-Layer Networks & Backpropagation
- 02:11:59 START: Understanding Transformers for LLMs
- 02:14:14 Transformers Lesson: Attention Mechanism (Query, Key, Value)
- 02:32:39 Transformers Lesson: Self-Attention & Causal Self-Attention
- 02:40:48 Transformers Lesson: Rotary Positional Embeddings (RoPE)
- 02:44:07 Transformers Lesson: Multi-Head Attention
- 02:55:03 Transformers Lesson: Transformer Block (Feed-Forward, Add & Norm)
- 03:04:15 Tokenization for the GPT Architecture
- 03:06:47 Conclusion & Next Steps
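The attention lessons in the syllabus above build toward causal self-attention. A minimal sketch of the query/key/value computation with a causal mask, using arbitrary dimensions and random data purely for illustration, might look like:

```python
import math
import torch

torch.manual_seed(0)
T, d = 4, 8                                  # sequence length, embedding size (demo values)
x = torch.randn(T, d)                        # token embeddings (random for demo)

# Learned projections would normally be nn.Linear layers; plain matrices here.
Wq, Wk, Wv = (torch.randn(d, d) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv

scores = Q @ K.T / math.sqrt(d)              # scaled dot-product scores, shape (T, T)
mask = torch.triu(torch.ones(T, T), diagonal=1).bool()
scores = scores.masked_fill(mask, float("-inf"))  # causal: no attending to future tokens
weights = torch.softmax(scores, dim=-1)      # each row sums to 1
out = weights @ V                            # weighted mix of value vectors, shape (T, d)
print(out.shape)
```

Multi-head attention repeats this computation in parallel over several smaller subspaces and concatenates the results, which the course covers in the 02:44:07 lesson.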

Taught by

freeCodeCamp.org

