Overview
Explore the scaling of test-time inference compute and the rise of reasoning models in this 32-minute conference talk by Data Scientist Jayita Bhattacharyya at the Linux Foundation's Open Source Summit. Delve into how computational resources are allocated and optimized during the inference phase of machine learning models, with a focus on the growing importance of reasoning capabilities in AI systems. Learn about the challenges and solutions involved in scaling inference compute for complex reasoning tasks, and see how modern models are evolving to incorporate more sophisticated reasoning mechanisms. Gain practical insights for data scientists and machine learning practitioners working with large-scale AI systems, and discover how the trade-off between computational efficiency and model performance is shaping current trends in inference optimization.
Syllabus
Scaling Test-time Inference Compute & Rise of Reasoning Models - Jayita Bhattacharyya, Data Scientist
Taught by
Linux Foundation