Quantifying the Uncertainty in Model Predictions Using Conformal Prediction
Toronto Machine Learning Series (TMLS) via YouTube
Overview
Explore the concept of conformal prediction in this 33-minute conference talk from the Toronto Machine Learning Series. Learn how to quantify uncertainty in neural network predictions and generate alternative outputs when models are unsure. Discover the versatility, statistical rigor, and simplicity of conformal prediction as a method applicable to both classification and regression tasks. Gain insights into its three-step implementation process and understand how it can be applied to real-world use cases. Presented by Jesse Cresswell, Senior Machine Learning Scientist at Layer 6 AI, this talk provides valuable knowledge for addressing the challenge of overconfident wrong predictions in neural networks.
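The three-step process mentioned above can be sketched with the standard split conformal recipe for classification: score a held-out calibration set, take an adjusted quantile of the scores, then form prediction sets at test time. This is a minimal illustration of the general method, not necessarily the speaker's exact implementation; the function name and the choice of nonconformity score are assumptions for the example.

```python
# Minimal sketch of split conformal prediction for classification.
# Standard split-conformal recipe, not the speaker's exact code.
import numpy as np

def conformal_classify(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Return prediction sets that cover the true label with
    probability at least 1 - alpha (marginally, over exchangeable data)."""
    n = len(cal_labels)
    # Step 1: nonconformity scores on a held-out calibration set
    # (here: one minus the model's probability for the true class).
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Step 2: the finite-sample-adjusted (1 - alpha) quantile of the scores.
    q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)
    # Step 3: prediction set = every label whose score falls below the threshold.
    return [np.where(1.0 - p <= q)[0] for p in test_probs]
```

Because the scores come from a held-out set, the coverage guarantee holds regardless of how well calibrated the underlying network is — an unsure model simply produces larger prediction sets, which is exactly the "alternative outputs" behavior described in the talk.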
Syllabus
Quantifying the Uncertainty in Model Predictions
Taught by
Toronto Machine Learning Series (TMLS)