Dialogue Systems - Natural Language Processing with Transformer-based Models II
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Explore dialogue systems and natural language processing in this tutorial on Transformer-based models and large language models (LLMs). Learn the fundamentals of the Transformer neural architecture, including common issues and misconceptions, and gain insight into how LLMs are trained. Discover how to apply these models to NLP tasks such as machine translation (where Transformers originated), speech translation, data-to-text generation, and chatbot development. Examine the specific challenges and opportunities of dialogue response generation, with particular attention to semantic accuracy and grounding in language generation, and learn evaluation methods for assessing generation accuracy in conversational AI systems. The session is presented by Ondřej Dušek of Charles University Prague, whose research focuses on generative language models for data-to-text and dialogue applications; it offers both theoretical foundations and practical guidance for building dialogue systems with state-of-the-art Transformer architectures.
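As a concrete taste of the Transformer fundamentals the tutorial covers, the sketch below implements scaled dot-product attention, the core operation of the architecture. This is an illustrative NumPy example, not material from the lecture itself; the toy matrix shapes are arbitrary choices.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                        # weighted mixture of value vectors

# Toy example: 3 query/key/value vectors of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per query
```

In a full Transformer this operation runs in parallel across multiple heads and is combined with feed-forward layers, residual connections, and layer normalization.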
Syllabus
[camera] Day 5 afternoon - JSALT 2025 - Dušek: Dialogue Systems (NLP with Transformer-based Models)
Taught by
Center for Language & Speech Processing (CLSP), JHU