Attend this machine learning lecture exploring advanced techniques for improving conformal prediction in high-stakes classification. Learn how temperature scaling affects prediction set sizes in adaptive conformal prediction methods, including the surprising finding that calibration can hurt prediction sets, and the mathematical theory behind the non-monotonic relationship between temperature scaling and class-conditional coverage.

Explore approaches that enhance conformal prediction by incorporating class similarity through semantic grouping, in which classes requiring similar treatment (such as related diseases) are partitioned into groups to produce more meaningful prediction sets. Discover how augmenting the conformal prediction score function with an "out-of-group" penalty term can reduce both the number of semantically distinct groups in a set and the average set size, with theoretical proofs demonstrating the practical advantages. Examine a general, model-specific variant that eliminates the need for human-defined semantic partitions while consistently enhancing conformal prediction methods across applications.

The presentation covers two comprehensive research works with extensive empirical studies, providing practical guidelines for practitioners to effectively combine adaptive conformal prediction with calibration techniques aligned with specific user-defined goals.
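To make the temperature-scaling effect concrete, here is a minimal sketch of split conformal prediction with the adaptive (APS-style) score on synthetic logits. This is illustrative only and not the lecture's actual method or data: the data generator, the target coverage level, and all names are assumptions. It shows that recalibrating the threshold at each temperature keeps marginal coverage near the target while the average prediction set size changes with the temperature.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_test, K = 500, 500, 10  # calibration size, test size, classes

def make_data(n):
    """Synthetic logits where the true class gets a boost (assumption)."""
    y = rng.integers(0, K, n)
    logits = rng.normal(0.0, 1.0, (n, K))
    logits[np.arange(n), y] += 2.0
    return logits, y

def softmax(z, T):
    """Temperature-scaled softmax: divide logits by T before normalizing."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def aps_sets(cal_logits, cal_y, test_logits, T, alpha=0.1):
    """Split conformal prediction with an APS-style score.

    Score = cumulative probability mass of classes ranked at least as
    high as the true class; prediction sets collect top classes until
    the calibrated cumulative-mass threshold is reached.
    """
    # Calibration scores
    p = softmax(cal_logits, T)
    order = np.argsort(-p, axis=1)
    cum = np.cumsum(np.take_along_axis(p, order, axis=1), axis=1)
    ranks = np.argmax(order == cal_y[:, None], axis=1)
    scores = cum[np.arange(len(cal_y)), ranks]
    # Conformal quantile with finite-sample correction
    n = len(scores)
    q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")
    # Test-time prediction sets
    p_t = softmax(test_logits, T)
    order_t = np.argsort(-p_t, axis=1)
    cum_t = np.cumsum(np.take_along_axis(p_t, order_t, axis=1), axis=1)
    return [order_t[i, : np.searchsorted(cum_t[i], q) + 1]
            for i in range(len(test_logits))]

cal_logits, cal_y = make_data(n_cal)
test_logits, test_y = make_data(n_test)
for T in (0.5, 1.0, 2.0):
    sets = aps_sets(cal_logits, cal_y, test_logits, T)
    cov = np.mean([test_y[i] in s for i, s in enumerate(sets)])
    size = np.mean([len(s) for s in sets])
    print(f"T={T}: coverage={cov:.3f}, avg set size={size:.2f}")
```

Because the threshold is recalibrated for each temperature, marginal coverage stays close to 1 − α at every T, while average set size varies, which is why the interaction between calibration and set size (and its non-monotonic effect on class-conditional coverage) needs the careful analysis the lecture describes.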