
Machine Translation - NLP with Transformer-based Models

Center for Language & Speech Processing (CLSP), JHU via YouTube

Overview

Explore machine translation and natural language processing in this tutorial delivered by Ondřej Bojar of Charles University, Prague, as part of the JSALT 2025 workshop series. The tutorial covers Transformer-based models and their applications in NLP, with particular emphasis on machine translation, the task for which Transformers were first introduced.

Learn the fundamentals of the Transformer neural architecture, common issues and misconceptions surrounding these models, and the basics of large language model (LLM) training. Discover how Transformers and LLMs are applied across NLP tasks including machine translation, speech translation, data-to-text generation, and chatbot/dialogue systems. The tutorial draws on the expertise of a leading machine translation researcher who has co-organized the influential WMT (Workshop on Machine Translation) shared tasks since 2013 and has contributed to both pre-neural statistical machine translation and modern neural approaches, including speech translation and meeting minuting.

Syllabus

[camera] Day 5 morning - JSALT 2025 - Bojar: Machine Translation (NLP with Transformer-based Models)

Taught by

Center for Language & Speech Processing (CLSP), JHU

