On-Device LLMs with Functional Token Fine-Tuning - Octopus v2 Implementation
Discover AI via YouTube
Overview
Learn about cutting-edge developments in on-device Large Language Models (LLMs) in this technical video exploring functional token fine-tuning and the Octopus v2 framework. Dive into Stanford University's research on efficient function calling for edge devices such as iPhones and Pixel phones using Gemma 2B. Examine practical code implementations of function calling across major AI platforms, including OpenAI, Anthropic's Claude 3, and Cohere's Command R+. Understand how functional tokens substantially improve the energy efficiency of LLM function calls, and explore the broader implications for industry leaders like NVIDIA and Microsoft. Master the technical aspects of implementing AI agents with improved accuracy and inference speed through hands-on demonstrations and real-world applications.
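The core idea behind Octopus v2's functional tokens can be sketched as follows: each callable function is mapped to a single special token added to the model's vocabulary, so the model selects a function by emitting one token rather than spelling out a full function name, which is what drives the latency and energy savings. This is a minimal illustrative sketch, not the paper's implementation; the token names and function names here are hypothetical.

```python
# Hypothetical vocabulary extension: one special token per function,
# in the spirit of Octopus v2's functional tokens.
FUNCTIONAL_TOKENS = {
    "<nexa_0>": "take_photo",
    "<nexa_1>": "set_alarm",
    "<nexa_2>": "send_text",
}

def dispatch(generated: str) -> str:
    """Resolve a model output that begins with a functional token."""
    for token, fn_name in FUNCTIONAL_TOKENS.items():
        if generated.startswith(token):
            # Everything after the token is the argument payload.
            args = generated[len(token):].strip()
            return f"{fn_name}({args})"
    return "no_function"

print(dispatch("<nexa_1> time='07:00'"))  # set_alarm(time='07:00')
```

Because the function identity is a single token, the decoder produces the dispatch decision in one step instead of generating many tokens of function-name text.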
Syllabus
On Device LLM
Octopus v2 Function Calling: Apple, Google
CODE: Anthropic, Cohere Function Calling
Implications for NVIDIA, Microsoft?
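The code chapters above walk through provider function-calling APIs. As a minimal offline sketch of the OpenAI-style tool schema (the function name, parameters, and simulated response here are illustrative; a real call would pass `tools` to the chat completions endpoint):

```python
import json

# Illustrative tool declaration in the OpenAI-style JSON Schema format.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}]

# Simulated tool call as it would appear in a model response; the
# arguments arrive as a JSON string that the caller must parse.
tool_call = {"name": "get_weather", "arguments": '{"city": "Palo Alto"}'}
args = json.loads(tool_call["arguments"])
print(tool_call["name"], args["city"])  # get_weather Palo Alto
```

Anthropic's and Cohere's APIs use the same pattern of declared tool schemas plus structured tool-call responses, with provider-specific field names.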
Taught by
Discover AI