Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension
Association for Computing Machinery (ACM) via YouTube
Overview
Learn about an innovative approach to machine reading comprehension in this 21-minute conference presentation from SIGIR 2024. Explore MTMS (Multi-teacher Multi-stage Knowledge Distillation), a framework for reasoning-based machine reading comprehension presented by researchers Zhao Zhuo, Xie Zhiwen, Zhou Guangyou, and Huang Xiangji. Dive into how multiple teacher models and staged knowledge distillation can enhance the reasoning capabilities of reading comprehension systems, as demonstrated in this Association for Computing Machinery (ACM) session focused on Question Answering and Summarisation.
Syllabus
SIGIR 2024 W1.4 [fp] MTMS: Multi-teacher Multi-stage Knowledge Distillation
Taught by
Association for Computing Machinery (ACM)