Learning Semantic Parsers for More Languages and with Less Supervision
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Explore advanced techniques for developing semantic parsers that can work across multiple languages while requiring minimal supervised training data in this comprehensive lecture by Luke Zettlemoyer from the University of Washington. Delve into cutting-edge research methodologies that address the challenge of creating robust natural language understanding systems capable of parsing meaning from text in diverse linguistic contexts. Learn about innovative approaches to reduce the dependency on large annotated datasets, making semantic parsing more accessible for low-resource languages. Discover how modern machine learning techniques can be leveraged to build parsers that generalize across different languages and domains with limited supervision. Gain insights into the theoretical foundations and practical applications of cross-lingual semantic parsing, including transfer learning strategies, multilingual representations, and weakly supervised learning paradigms. Understand the implications of these advances for building more inclusive and globally applicable natural language processing systems that can serve speakers of various languages without requiring extensive language-specific training data.
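To make the core task concrete: a semantic parser maps a natural-language utterance to a machine-executable logical form. The toy sketch below is purely illustrative (the grammar, database, and function names are invented for this example, not taken from the lecture); real systems learn such mappings from data rather than hand-coding them, which is exactly the supervision bottleneck the lecture addresses.

```python
# Toy illustration of semantic parsing: utterance -> logical form -> answer.
# All names here (CAPITALS, parse, execute) are hypothetical, for illustration.

CAPITALS = {"france": "Paris", "japan": "Tokyo"}


def parse(utterance):
    """Map an utterance to a logical form over a tiny database."""
    tokens = utterance.lower().rstrip("?").split()
    if tokens[:5] == ["what", "is", "the", "capital", "of"]:
        # Logical form: capital_of(country)
        return ("capital_of", tokens[5])
    return None


def execute(logical_form):
    """Evaluate the logical form against the database."""
    if logical_form and logical_form[0] == "capital_of":
        return CAPITALS.get(logical_form[1])
    return None


lf = parse("What is the capital of France?")
print(lf)           # ('capital_of', 'france')
print(execute(lf))  # Paris
```

A learned parser replaces the hand-written `parse` with a model trained from annotated pairs, or, in the weakly supervised setting the lecture discusses, from answers alone.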
Syllabus
Luke Zettlemoyer: Learning Semantic Parsers for More Languages and with Less Supervision
Taught by
Center for Language & Speech Processing (CLSP), JHU