Methods for Long Context Language Models: MagicPIG and Factor - Lecture 23
Graham Neubig via YouTube
Overview
Learn about efficient methods for handling long contexts in language models in this guest lecture delivered by Beidi Chen as part of Carnegie Mellon University's Advanced Natural Language Processing course. The lecture explores two approaches, MagicPIG and Factor, that address the challenges of processing extended text sequences, and offers insights into current techniques for improving the performance of NLP systems on lengthy inputs. Part of CMU's CS 11-711 Advanced NLP Fall 2024 curriculum, this 50-minute lecture presents practical methods for expanding the context window of language models while maintaining computational efficiency.
Syllabus
CMU Advanced NLP Fall 2024 (23): MagicPIG & Factor - Methods for Long Context LMs
Taught by
Graham Neubig