Self-Improving Teacher Cultivates Better Student: Distillation Calibration for Multimodal Large Language Models - Lecture 3.3
Association for Computing Machinery (ACM) via YouTube
Overview
Explore a cutting-edge approach to improving multimodal large language models in this 14-minute conference talk from SIGIR 2024. The paper, "Self-Improving Teacher Cultivates Better Student: Distillation Calibration for Multimodal Large Language Models," presented by authors Xinwei Li, Li Lin, Shuai Wang, and Chen Qian, introduces techniques for enhancing the performance of multimodal AI systems through teacher self-improvement and distillation calibration. Gain insight into recent advances in multimodal large language models and their potential applications.
Syllabus
SIGIR 2024 M3.3 [fp] Self-Imp Teacher Cultivates Better Student: Distillation Calibration For MM LLM
Taught by
Association for Computing Machinery (ACM)