
How to Run Any LLM On-Device With React Native

Callstack Engineers via YouTube

Overview

Learn to run large language models directly inside React Native applications using react-native-ai in this conference talk from React Advanced London 2025. Discover why on-device LLMs matter for privacy, reduced latency, and offline functionality in mobile AI applications. Explore the react-native-ai library's abstraction layer for local AI execution and its provider architecture. Examine integrations with the MLC LLM Engine and Apple's Foundation Models, along with recent enhancements including improved debugging, tool calling, and agent pipelines. The talk focuses on practical implementation details and architectural trade-offs, giving a clear picture of what is currently possible when running AI entirely on mobile devices without external dependencies.
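The provider architecture described above means app code talks to a common interface while interchangeable backends (such as the MLC LLM Engine or Apple's Foundation Models) plug in underneath. A minimal, self-contained sketch of that idea follows; every name in it (`LLMProvider`, the mock classes, `complete`) is illustrative and not the actual react-native-ai API.

```typescript
// Illustrative provider-style abstraction for on-device LLM backends.
// These names are hypothetical stand-ins, not the react-native-ai API.

interface LLMProvider {
  readonly name: string;
  generate(prompt: string): Promise<string>;
}

// Stand-in for an MLC LLM Engine backend.
class MockMLCProvider implements LLMProvider {
  readonly name = "mlc";
  async generate(prompt: string): Promise<string> {
    return `[mlc] echo: ${prompt}`;
  }
}

// Stand-in for Apple's on-device Foundation Models backend.
class MockAppleProvider implements LLMProvider {
  readonly name = "apple";
  async generate(prompt: string): Promise<string> {
    return `[apple] echo: ${prompt}`;
  }
}

// App code depends only on the abstraction, never on a concrete backend.
async function complete(provider: LLMProvider, prompt: string): Promise<string> {
  return provider.generate(prompt);
}

async function main(): Promise<void> {
  // A real app would pick a backend based on platform and model availability.
  const provider: LLMProvider = new MockMLCProvider();
  console.log(await complete(provider, "Hello"));
}

main();
```

The payoff of this design is that swapping the on-device engine is a one-line change at the call site, which is why the talk can cover several backends behind one library surface.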

Syllabus

How to Run Any LLM On-Device With React Native by Szymon Rybczak | React Advanced London 2025

Taught by

Callstack Engineers

