LaMini-LM - Mini Models Trained on Maxi Data
Overview
Explore the creation of LaMini-LM, a collection of distilled language models trained on large-scale instructions, in this informative video. Dive into the key ideas, the dataset-creation process, and the model-training methodology outlined in the research paper. Examine the diverse range of models trained, including LaMini-Neo-1.3B, LaMini-GPT-1.5B, and LaMini-Flan-T5-783M. Learn about the Hugging Face dataset used, and watch demonstrations of prompts on ChatGPT. Gain practical insights through code examples, and use the provided Colab notebooks for hands-on experimentation with these mini models trained on maxi data.
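Instruction-tuned models like these are typically queried by wrapping the user's instruction in a fixed prompt template before generation. The sketch below illustrates that pattern; the Alpaca-style template text and the `build_prompt` helper are assumptions for illustration, not the exact format used in the video or paper.

```python
# Hypothetical sketch: wrapping a user instruction in an Alpaca-style
# prompt template, as commonly done for instruction-tuned models.
# The template wording is an assumption, not the verified LaMini-LM format.
def build_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:"
    )

print(build_prompt("List three creative uses for a paperclip."))
```

The formatted string would then be passed to the model (e.g. via a Hugging Face `transformers` text-generation pipeline, as demonstrated in the Colab notebooks), and the text after `### Response:` is taken as the answer.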
Syllabus
Intro
Key Idea
Diagram
Dataset
Hugging Face Dataset
Trained on Many Models
Paper
Prompts on ChatGPT
Code Time
Taught by
Sam Witteveen