
Large Concept Model - Beyond Token-Based LLMs

Center for Language & Speech Processing (CLSP), JHU via YouTube

Overview

Attend this plenary conference talk exploring the Large Concept Model as an alternative to token-based Large Language Models on the path to Advanced Machine Intelligence. Learn about the limitations of current token-based LLMs, which lack the explicit reasoning and planning, hierarchical processing, and multilingual capabilities that are crucial characteristics of human intelligence. Discover how the Large Concept Model addresses these shortcomings by training in a multimodal and multilingual sentence representation space using diffusion-based methods. Explore the model's strong performance on generative tasks, its impressive zero-shot multilingual capabilities, and several variants, including initial attempts at hierarchical text processing. Gain insights from Loïc Barrault, a Research Scientist at Meta AI with extensive experience in statistical and neural machine translation, multimodal processing, and lifelong learning methods. His recent work includes contributions to NLLB-200 for translation across 200 languages, Seamless-M4T for speech-to-speech translation across 100 languages, and reasoning in embedding spaces through the Large Concept Model.

Syllabus

July 15th, 2025 — 11:00 CEST

Taught by

Center for Language & Speech Processing (CLSP), JHU

