Multilingual Representation Distillation with Contrastive Learning
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Explore a 12-minute conference talk from the European Chapter of the Association for Computational Linguistics (EACL) 2023, presented by Steven Weiting Tan from the Center for Language & Speech Processing (CLSP) at Johns Hopkins University. Dive into an approach that integrates contrastive learning into multilingual representation distillation for quality estimation of parallel sentences. Discover how this method improves the ability to find semantically similar sentences that can serve as translations across different languages. Learn about experimental results demonstrating consistent gains over previous sentence encoders such as LASER, LASER3, and LaBSE, particularly in low-resource language scenarios. Gain insight into applications of the technique in multilingual similarity search and corpus filtering, and its potential impact on cross-lingual information retrieval and matching.
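To make the core idea concrete, here is a minimal sketch (not the talk's actual implementation) of the kind of contrastive objective used to distill a multilingual student encoder against a teacher: each sentence's student embedding should be closest to its paired teacher embedding within a batch, pushing apart all other pairs. The function name, shapes, and temperature value are illustrative assumptions.

```python
import numpy as np

def contrastive_distillation_loss(teacher_emb, student_emb, temperature=0.05):
    """InfoNCE-style loss: each student embedding (e.g. of a low-resource
    translation) should match its paired teacher embedding (e.g. of the
    source sentence) against all other sentences in the batch.

    teacher_emb, student_emb: arrays of shape (batch, dim), row i of each
    corresponding to the same parallel sentence pair.
    """
    # L2-normalize so dot products are cosine similarities
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=1, keepdims=True)
    s = student_emb / np.linalg.norm(student_emb, axis=1, keepdims=True)
    logits = s @ t.T / temperature            # (batch, batch) similarity matrix
    # softmax cross-entropy with the diagonal (true pairs) as targets
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Illustrative usage with random "embeddings": aligned pairs score a much
# lower loss than mismatched (shuffled) pairs.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(4, 8))
aligned_loss = contrastive_distillation_loss(teacher, teacher)
shuffled_loss = contrastive_distillation_loss(teacher, teacher[::-1])
```

In practice the student would be a smaller or language-extended encoder trained so that sentences and their translations land near each other in the teacher's embedding space, which is what enables the similarity-search and corpus-filtering applications mentioned above.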
Syllabus
Multilingual Representation Distillation with Contrastive Learning - EACL 2023
Taught by
Center for Language & Speech Processing (CLSP), JHU