Overview
Learn how large language models can automatically generate assertions to assist program verifiers in this 15-minute conference presentation from OOPSLA 2025. Discover the challenges faced by automated program verifiers such as Dafny, whose SMT solvers often require manual hints in the form of assertions, creating additional work for proof engineers. Explore Laurel, a tool that leverages LLMs to automatically generate these required assertions, reducing the burden on developers. Examine two key domain-specific prompting techniques: analyzing verifier error messages to determine where assertions should be placed, and selecting relevant example assertions from the same codebase using a novel proof-similarity metric. Review the evaluation results on a new benchmark dataset of complex lemmas drawn from three real-world Dafny codebases, demonstrating the tool's ability to generate over 56.6% of the required assertions within a small number of attempts. Understand how this approach makes LLMs a cost-effective solution for unblocking program verifiers without requiring human intervention, advancing the field of automated program verification through machine learning integration.
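The second prompting technique, selecting few-shot example assertions by proof similarity, can be illustrated with a minimal sketch. The snippet below is hypothetical and does not reproduce Laurel's actual metric (which is defined in the paper); it stands in a simple token-overlap (Jaccard) similarity to show the general idea of ranking previously verified lemmas from the same codebase and reusing the most similar ones as prompt examples.

```python
def proof_similarity(a: str, b: str) -> float:
    """Token-overlap (Jaccard) similarity between two lemma texts.
    Illustrative stand-in for Laurel's proof-similarity metric."""
    ta, tb = set(a.split()), set(b.split())
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

def select_examples(failing_lemma: str, verified_lemmas: dict[str, str], k: int = 2) -> list[str]:
    """Rank previously verified lemmas by similarity to the failing one
    and return the names of the top-k to use as few-shot prompt examples."""
    ranked = sorted(
        verified_lemmas.items(),
        key=lambda item: proof_similarity(failing_lemma, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:k]]

# Hypothetical usage: pick the most similar verified lemma as a prompt example.
failing = "lemma CommAdd(x: int, y: int) ensures x + y == y + x"
examples = select_examples(
    failing,
    {
        "CommMul": "lemma CommMul(x: int, y: int) ensures x * y == y * x",
        "SortedLen": "lemma SortedLen(s: seq<int>) ensures |Sorted(s)| == |s|",
    },
    k=1,
)
```

In the real tool, the selected examples are inserted into the LLM prompt alongside the verifier's error message, so the model sees both where an assertion is needed and what assertions in similar proofs look like.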
Syllabus
[OOPSLA'25] Laurel: Unblocking Automated Verification with Large Language Models
Taught by
ACM SIGPLAN