Methods for Long Context Language Models: MagicPIG and Factor - Lecture 23
Graham Neubig via YouTube
Overview
Learn about efficient methods for handling long contexts in language models through a guest lecture delivered by Beidi Chen in Carnegie Mellon University's Advanced Natural Language Processing course. Explore two key approaches - MagicPIG and Factor - that address the challenges of processing extended text sequences in language models. Gain insight into cutting-edge techniques for improving the performance and capabilities of NLP systems on lengthy inputs. Part of CMU's CS 11-711 Advanced NLP Fall 2024 curriculum, this 50-minute lecture presents practical methods for expanding the context window of language models while maintaining computational efficiency.
Syllabus
CMU Advanced NLP Fall 2024 (23): MagicPIG & Factor - Methods for Long Context LMs
Taught by
Graham Neubig