
Linux Foundation

Scaling Test-time Inference Compute and Rise of Reasoning Models

Linux Foundation via YouTube

Overview

Explore the scaling of test-time inference compute and the rise of reasoning models in this 32-minute conference talk by Data Scientist Jayita Bhattacharyya, presented at the Linux Foundation's Open Source Summit. The talk examines how computational resources are allocated and optimized during the inference phase of machine learning models, with a focus on the growing importance of reasoning capabilities in AI systems. Learn about the challenges and techniques involved in scaling inference compute for complex reasoning tasks, how modern models incorporate more sophisticated reasoning mechanisms, and the trade-offs between computational efficiency and model performance in reasoning-based applications. Gain practical insights for data scientists and machine learning practitioners working with large-scale AI systems, along with a view of the trends shaping inference optimization in machine learning workflows.
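The core idea behind scaling test-time compute can be illustrated with a toy self-consistency (majority-vote) sketch: sample several independent reasoning paths from a model and keep the most common final answer, so that more inference compute buys more reliability. The model stub, success rate, and numbers below are hypothetical placeholders, not taken from the talk.

```python
import random
from collections import Counter

def sample_answer(question: str, rng: random.Random) -> str:
    """Stand-in for one stochastic model call (hypothetical toy model).

    Simulates a noisy reasoning chain: returns the correct answer
    "42" with probability 0.9, otherwise a nearby wrong answer.
    """
    return "42" if rng.random() < 0.9 else rng.choice(["41", "43"])

def self_consistency(question: str, n_samples: int, seed: int = 0) -> str:
    """Scale test-time compute by drawing n_samples reasoning paths
    and taking a majority vote over their final answers."""
    rng = random.Random(seed)
    answers = [sample_answer(question, rng) for _ in range(n_samples)]
    return Counter(answers).most_common(1)[0][0]

# More samples -> more test-time compute -> a more reliable answer,
# without retraining or changing the underlying model.
answer = self_consistency("What is 6 * 7?", n_samples=25)
```

A single sample can return a wrong answer, but with 25 samples the majority vote is almost certain to land on "42"; real reasoning models apply the same principle with chains of thought, verifiers, or search in place of this toy voting loop.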

Syllabus

Scaling Test-time Inference Compute & Rise of Reasoning Models - Jayita Bhattacharyya, Data Scientist

Taught by

Linux Foundation

