Overview
Explore parameter-efficient tuning techniques for boosting Large Language Model (LLM) performance in this 25-minute conference talk from the 2023 GAIA Conference. Delve into the adaptation of p-tuning, a prompt-learning method, to low-resource language settings, with a focus on Swedish. Learn about an improved version of p-tuning implemented in NVIDIA NeMo that enables continuous multitask learning of virtual prompts. Gain insights from Zenodia Charpy, a senior deep learning data scientist at NVIDIA, as she shares her expertise in training and deploying very large language models for non-English and low-resource languages. Discover how these techniques can help solve real-world natural language tasks, improve performance on a range of downstream NLP tasks, and strengthen the factual grounding of LLM responses.
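To make the core idea concrete, here is a minimal sketch of the p-tuning/prompt-learning approach the talk covers: the base model's weights stay frozen, and the only trainable parameters are a small set of continuous "virtual prompt" embeddings prepended to the input. This is not the NeMo implementation; the toy backbone, class name, and dimensions below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SoftPromptModel(nn.Module):
    """Illustrative p-tuning-style model: a frozen toy backbone (a stand-in
    for a real LLM) plus trainable virtual prompt embeddings."""

    def __init__(self, vocab_size=100, dim=16, num_virtual_tokens=8):
        super().__init__()
        # Frozen "backbone": token embeddings + a linear head (toy LLM stand-in).
        self.embed = nn.Embedding(vocab_size, dim)
        self.head = nn.Linear(dim, vocab_size)
        for p in self.embed.parameters():
            p.requires_grad_(False)
        for p in self.head.parameters():
            p.requires_grad_(False)
        # The only trainable parameters: continuous virtual prompt vectors.
        self.virtual_prompts = nn.Parameter(torch.randn(num_virtual_tokens, dim) * 0.02)

    def forward(self, input_ids):
        # input_ids: (batch, seq_len)
        batch = input_ids.size(0)
        tok = self.embed(input_ids)                          # (batch, seq, dim)
        prompts = self.virtual_prompts.unsqueeze(0).expand(batch, -1, -1)
        x = torch.cat([prompts, tok], dim=1)                 # prepend virtual tokens
        return self.head(x)                                  # (batch, prompts + seq, vocab)

model = SoftPromptModel()
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
logits = model(torch.randint(0, 100, (2, 5)))
```

Because gradients flow only into `virtual_prompts`, a separate prompt table can be learned per task while the backbone is shared, which is what enables the continuous multitask learning of virtual prompts described above.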
Syllabus
P-Tuning: A Parameter Efficient Tuning to Boost LLM Performance by Zenodia Charpy
Taught by
GAIA