Overview
Explore a mathematics seminar talk by Georgia Tech's Leyan Pan examining the logical reasoning capabilities of Transformer-based Large Language Models (LLMs) through the lens of SAT-solving. Delve into the investigation of Boolean reasoning in decoder-only Transformers using the Chain-of-Thought methodology, focusing specifically on their capacity to decide 3-SAT instances within bounded parameters. Learn about the formal expressiveness of Transformer models, understand the equivalence between Chain-of-Thought reasoning and the DPLL SAT-solving algorithm, and discover how 3-SAT formulas and partial assignments can be encoded as vectors and processed through attention mechanisms. Examine experimental results supporting the theoretical predictions, along with the limitations of standard Transformers on unbounded-length reasoning problems and potential ways to overcome these constraints.
Syllabus
Leyan Pan | Can Transformers Reason Logically? A Study in SAT-Solving
Taught by
Harvard CMSA