Multi-Class Language Classification with BERT and TensorFlow - Self Paced Online
Overview
Learn how to build a multi-class language classification model using BERT and TensorFlow in this comprehensive 43-minute tutorial. Explore the power of transformers in natural language processing as you work through each step of the process, from data preprocessing to model training and prediction. Follow along with clearly defined chapters for each section, including data input pipeline creation, model definition, and saving/loading techniques. Gain insights into the significance of transformers in deep learning and their dominance in NLP benchmarks. Utilize the HuggingFace transformers library to create an efficient and high-performing solution for multi-class text classification tasks.
Syllabus
Intro
Pulling Data
Preprocessing
Data Input Pipeline
Defining Model
Model Training
Saving and Loading Models
Making Predictions
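The middle syllabus steps (data input pipeline, defining the model, model training) might look roughly like the following sketch. The sequence length, class count, head size, and learning rate are assumptions; the overall pattern — a `tf.data` pipeline feeding a BERT body with a dense classification head — is what the overview describes:

```python
import numpy as np
import tensorflow as tf
from transformers import TFAutoModel

SEQ_LEN, NUM_CLASSES = 128, 5  # assumed values

# Placeholder arrays standing in for the tokenized dataset.
ids = np.random.randint(0, 30000, (32, SEQ_LEN))
mask = np.ones((32, SEQ_LEN), dtype="int32")
y = tf.one_hot(np.random.randint(0, NUM_CLASSES, 32), NUM_CLASSES)

# Data input pipeline: wrap the arrays in a tf.data.Dataset, shuffle, batch.
ds = (tf.data.Dataset.from_tensor_slices(
          ({"input_ids": ids, "attention_mask": mask}, y))
      .shuffle(1000)
      .batch(8))

# Defining the model: BERT body plus a small softmax classification head.
bert = TFAutoModel.from_pretrained("bert-base-cased")
input_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype="int32",
                                  name="input_ids")
attention_mask = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype="int32",
                                       name="attention_mask")
embeddings = bert.bert(input_ids, attention_mask=attention_mask)[1]  # pooled [CLS]
x = tf.keras.layers.Dense(1024, activation="relu")(embeddings)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax",
                                name="outputs")(x)
model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=outputs)

# Model training: one-hot labels pair with categorical cross-entropy.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(ds, epochs=3)  # the actual training call; costly, so commented out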
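The final syllabus steps (saving and loading models, making predictions) can be sketched independently of BERT itself. The tiny Embedding classifier below is a stand-in so the example stays lightweight — the real model in the video wraps BERT and also takes an attention mask — and the file path is purely illustrative:

```python
import numpy as np
import tensorflow as tf

SEQ_LEN, NUM_CLASSES = 128, 5  # assumed values matching the sketch above

# Lightweight stand-in classifier with the same (batch, SEQ_LEN) -> (batch,
# NUM_CLASSES) shape a BERT classifier would have.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN,), dtype="int32"),
    tf.keras.layers.Embedding(30000, 16),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# Saving and loading: architecture and weights round-trip through one file.
model.save("classifier_model.keras")  # illustrative path
reloaded = tf.keras.models.load_model("classifier_model.keras")

# Making predictions: feed token ids, take the argmax over class scores.
ids = np.zeros((1, SEQ_LEN), dtype="int32")
probs = reloaded.predict(ids, verbose=0)
pred = int(np.argmax(probs, axis=-1)[0])
```

In practice the reloaded model is fed the same `input_ids` (and, for the BERT version, `attention_mask`) tensors the tokenizer produces.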
Taught by
James Briggs