
YouTube

CPU Inferencing of Language Models in Teradata

MLOps World: Machine Learning in Production via YouTube

Overview

Learn how to implement CPU-based inferencing for language models within the Teradata platform in this comprehensive conference talk from MLOps World. Discover advanced strategies for deploying and optimizing language model inference on CPU infrastructure, exploring the technical considerations and practical approaches for production-scale implementations. Examine the intersection of machine learning operations and database systems, focusing on how Teradata's architecture supports efficient language model processing without GPU dependencies. Gain insights into industry-specific applications, particularly in financial services, and understand the strategic implications of CPU-based inference for enterprise AI deployments. Explore performance optimization techniques, scalability considerations, and best practices for integrating language model capabilities into existing data infrastructure while maintaining operational efficiency and cost-effectiveness.
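The talk itself is not transcribed here, but one widely used optimization for CPU-based language-model inference is int8 weight quantization, which shrinks model memory footprint and speeds up CPU arithmetic. The sketch below is illustrative only and is not code from the talk or from Teradata; all names are hypothetical.

```python
# Minimal sketch of per-tensor int8 weight quantization, a common
# technique behind efficient CPU inference of language models.
# Illustrative example only -- not code from the talk or Teradata.

def quantize_int8(weights):
    """Map float weights to int8 range [-127, 127] with one shared scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [x * scale for x in q]

weights = [0.42, -1.3, 0.07, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# The round-trip error stays small relative to the original weights.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

In practice, production runtimes apply this idea per-channel or per-block and pair it with vectorized int8 kernels, which is what makes GPU-free deployments cost-competitive for many enterprise workloads.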

Syllabus

CPU Inferencing of Language Models in Teradata

Taught by

MLOps World: Machine Learning in Production
