Learning Semantic Parsers for More Languages and with Less Supervision
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Explore techniques for building semantic parsers that work across multiple languages while requiring minimal supervised training data, in this lecture by Luke Zettlemoyer of the University of Washington. The talk covers research methodologies for creating robust natural language understanding systems that parse meaning from text in diverse linguistic contexts, along with approaches that reduce dependence on large annotated datasets, making semantic parsing more accessible for low-resource languages. Learn how modern machine learning techniques can be leveraged to build parsers that generalize across languages and domains with limited supervision, and gain insight into the theoretical foundations and practical applications of cross-lingual semantic parsing, including transfer learning strategies, multilingual representations, and weakly supervised learning paradigms. These advances point toward more inclusive, globally applicable natural language processing systems that can serve speakers of many languages without extensive language-specific training data.
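To give a concrete feel for the weakly supervised paradigm mentioned above, here is a minimal, purely illustrative sketch of "learning from denotations": the parser never sees gold logical forms, only the final answer, and candidate parses are kept if executing them yields that answer. All names and the toy grammar here are hypothetical and not from the lecture.

```python
# Toy sketch of weakly supervised semantic parsing (learning from
# denotations): we observe only the answer, not the logical form,
# and filter candidate parses by whether they execute to that answer.
from itertools import product

OPS = {"plus": lambda a, b: a + b, "times": lambda a, b: a * b}
NUMS = {"two": 2, "three": 3, "four": 4}

def candidate_parses(utterance):
    """Enumerate (op, arg1, arg2) logical forms over utterance words."""
    words = utterance.split()
    nums = [w for w in words if w in NUMS]
    ops = [w for w in words if w in OPS]
    for op, (a, b) in product(ops, product(nums, repeat=2)):
        yield (op, a, b)

def execute(parse):
    """Evaluate a logical form against the (trivial) world model."""
    op, a, b = parse
    return OPS[op](NUMS[a], NUMS[b])

def consistent_parses(utterance, answer):
    """Weak supervision: the answer alone prunes the parse space."""
    return [p for p in candidate_parses(utterance) if execute(p) == answer]

parses = consistent_parses("what is two plus three", 5)
```

In a real system the surviving parses would serve as (noisy) positive training signal for a statistical model; the point of the sketch is only that supervision comes from denotations, not annotated logical forms.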
Syllabus
Luke Zettlemoyer: Learning Semantic Parsers for More Languages and with Less Supervision
Taught by
Center for Language & Speech Processing (CLSP), JHU