

Hugging Face Course - Using Transformers - Chapter 2

Hugging Face via YouTube

Overview

Learn the inner workings of Hugging Face's pipeline function and the fundamentals of transformer models and tokenization, with both PyTorch and TensorFlow implementations. The tutorial shows what happens behind the scenes when you call a pipeline, how to instantiate a transformer model from scratch, and how the main tokenization approaches (word-based, character-based, and subword-based) differ. It then walks through the full tokenization pipeline and the techniques for batching inputs together in each framework, giving you the foundational knowledge needed to work with transformer models on natural language processing tasks.
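The chapter's opening videos take the pipeline function apart into its three stages: tokenization, the model forward pass, and post-processing. A minimal toy sketch of that structure (the vocabulary, stub model scoring, and labels below are invented for illustration; the real Hugging Face pipeline uses a pretrained tokenizer and model):

```python
import math

# Hypothetical toy vocabulary; real tokenizers are learned from data.
VOCAB = {"[UNK]": 0, "i": 1, "love": 2, "hate": 3, "this": 4, "movie": 5}

def tokenize(text):
    """Stage 1: split text and map each token to an input id."""
    return [VOCAB.get(tok, VOCAB["[UNK]"]) for tok in text.lower().split()]

def stub_model(input_ids):
    """Stage 2: stand-in for the model forward pass, returning raw logits."""
    score = sum(1 if i == VOCAB["love"] else -1 if i == VOCAB["hate"] else 0
                for i in input_ids)
    return [float(-score), float(score)]  # [negative_logit, positive_logit]

def postprocess(logits):
    """Stage 3: softmax the logits and pick the highest-scoring label."""
    exps = [math.exp(x) for x in logits]
    probs = [e / sum(exps) for e in exps]
    label = ["NEGATIVE", "POSITIVE"][probs.index(max(probs))]
    return {"label": label, "score": max(probs)}

def toy_pipeline(text):
    """Chain the three stages, mirroring what pipeline() hides from you."""
    return postprocess(stub_model(tokenize(text)))

print(toy_pipeline("I love this movie"))  # label: POSITIVE
```

The point of the sketch is only the shape: text goes in, ids flow through a model, and raw logits are turned into a human-readable prediction.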

Syllabus

What happens inside the pipeline function? (PyTorch)
What happens inside the pipeline function? (TensorFlow)
Instantiate a Transformers model (PyTorch)
Instantiate a Transformers model (TensorFlow)
Tokenizers Overview
Word-based tokenizers
Character-based tokenizers
Subword-based tokenizers
The tokenization pipeline
Batching inputs together (PyTorch)
Batching inputs together (TensorFlow)
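The final two videos cover batching inputs of different lengths. The core idea, padding shorter sequences to the longest length in the batch and tracking real tokens with an attention mask, can be sketched in plain Python (the pad id of 0 is an assumption for illustration; real tokenizers expose their own pad token id):

```python
# Toy sketch of batching variable-length token sequences.
# Shorter sequences are padded to the batch's longest length, and an
# attention mask marks real tokens (1) versus padding (0).
PAD_ID = 0  # assumed pad id; real tokenizers define their own

def batch_inputs(sequences):
    max_len = max(len(seq) for seq in sequences)
    input_ids = [seq + [PAD_ID] * (max_len - len(seq)) for seq in sequences]
    attention_mask = [[1] * len(seq) + [0] * (max_len - len(seq))
                      for seq in sequences]
    return {"input_ids": input_ids, "attention_mask": attention_mask}

batch = batch_inputs([[101, 7592, 102], [101, 7592, 2088, 999, 102]])
print(batch["input_ids"])       # both rows padded to length 5
print(batch["attention_mask"])  # 1 for real tokens, 0 for padding
```

The attention mask is what lets the model ignore the padding positions, so padded and unpadded runs of the same sentence produce the same result.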

Taught by

Hugging Face

