A Cheap Trick for Semantic Question Answering for GPU-Challenged Systems
OpenSource Connections via YouTube
Overview
Explore a cost-effective approach to semantic question answering in this conference talk from Haystack US 2023. Learn how to use Large Language Models (LLMs) at indexing time to generate questions from passages, then match those generated questions against incoming queries at search time using text-based or vector-based matching. Discover how this method sidesteps the high infrastructure costs and potential hallucinations of running an LLM inside the search pipeline itself. Gain insights into designing search systems that prioritize speed and affordability while maintaining quality. Presented by Sujit Pal, Technical Research Director at Elsevier Health Markets, the talk draws on his extensive experience in search engine development and in applying Machine Learning techniques to enhance search functionality.
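The core idea described above can be sketched in a few lines: at index time, each passage is stored alongside questions an LLM generated for it; at query time, the user's query is matched against those questions rather than against an LLM. The sketch below is a minimal illustration, not the talk's actual implementation: the "generated" questions are hand-written stand-ins for LLM output, and matching uses a simple bag-of-words cosine similarity where a real system might use an inverted index or dense embeddings.

```python
from collections import Counter
from math import sqrt

def bow_vector(text: str) -> Counter:
    # Lowercase bag-of-words term counts; a real system might use embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Index built offline: each passage is paired with questions an LLM
# generated for it at indexing time (hand-written stand-ins here).
index = [
    ("Aspirin reduces fever and relieves mild pain.",
     ["what does aspirin do", "does aspirin reduce fever"]),
    ("Ibuprofen is an NSAID that reduces inflammation.",
     ["what is ibuprofen", "does ibuprofen reduce inflammation"]),
]

def search(query: str) -> str:
    # Return the passage whose best generated question most resembles
    # the query; no LLM call happens at query time.
    qv = bow_vector(query)
    best = max(index,
               key=lambda pq: max(cosine(qv, bow_vector(q)) for q in pq[1]))
    return best[0]

print(search("does aspirin help with fever"))
```

Because the expensive LLM work is done once per passage at indexing time, query-time cost is just ordinary text or vector matching, which is the "cheap trick" the title refers to.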
Syllabus
Haystack US 2023 - Sujit Pal: A Cheap Trick for Semantic Question Answering for the GPU challenged
Taught by
OpenSource Connections