
Beyond CoT: ReasonFLUX - Hierarchical LLM Reasoning via Scaling Thought Templates

Discover AI via YouTube

Overview

This talk explores ReasonFlux, a hierarchical reasoning approach for Large Language Models (LLMs) developed by researchers at Princeton University and Peking University. Learn how ReasonFlux moves beyond Chain-of-Thought (CoT) and Tree-of-Thought methods by structuring reasoning steps into high-level thought templates that can be flexibly retrieved and adapted at inference time. Discover how this approach builds an internal library of structured reasoning strategies designed to shrink the search space for complex tasks. The presentation also examines how hierarchical reinforcement learning is used to plan template trajectories, in line with emerging research on decomposing complex reasoning into layered sub-tasks. Covering the paper by Ling Yang, Zhaochen Yu, Bin Cui, and Mengdi Wang, this 22-minute talk offers valuable insights for AI researchers interested in advanced reasoning techniques for language models.
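
To make the retrieve-and-chain idea concrete, here is a minimal Python sketch under stated assumptions. Every name in it (ThoughtTemplate, the keyword-overlap scorer, the greedy plan function) is a hypothetical illustration, not the authors' code: ReasonFlux selects its template trajectories with learned hierarchical reinforcement learning, whereas this sketch substitutes a simple heuristic ranking to show the overall shape of the approach.

# Hypothetical sketch of template-based hierarchical reasoning.
# A small library of high-level "thought templates" is searched for
# relevance, and the best-matching templates are chained into a plan
# whose steps an LLM would expand at inference time.

from dataclasses import dataclass

@dataclass
class ThoughtTemplate:
    name: str
    keywords: set[str]   # crude stand-in for an embedding
    steps: list[str]     # high-level reasoning outline

LIBRARY = [
    ThoughtTemplate(
        name="case_analysis",
        keywords={"cases", "absolute", "piecewise", "inequality"},
        steps=["Split the problem into exhaustive cases",
               "Solve each case independently",
               "Combine case results into the final answer"],
    ),
    ThoughtTemplate(
        name="substitution",
        keywords={"equation", "variable", "substitute", "system"},
        steps=["Identify a substitution that simplifies the expression",
               "Rewrite the problem in the new variable",
               "Solve, then map the solution back"],
    ),
]

def score(template: ThoughtTemplate, problem: str) -> int:
    # Keyword overlap as relevance; a real system would use
    # embedding similarity over a learned template library.
    return len(template.keywords & set(problem.lower().split()))

def plan(problem: str, depth: int = 2) -> list[ThoughtTemplate]:
    # Greedy stand-in for the learned planner: pick the top-scoring
    # templates. In the paper this trajectory is chosen by
    # hierarchical reinforcement learning, not a heuristic.
    ranked = sorted(LIBRARY, key=lambda t: score(t, problem), reverse=True)
    return [t for t in ranked[:depth] if score(t, problem) > 0]

if __name__ == "__main__":
    problem = "Solve the system of equations by choosing a variable to substitute"
    for template in plan(problem):
        print(f"Template: {template.name}")
        for step in template.steps:
            # Each high-level step would be expanded into detailed
            # token-level reasoning by the LLM.
            print(f"  - {step}")

The key design point the sketch mirrors is the separation of levels: the planner operates over whole templates rather than individual reasoning tokens, which is what lets the search space stay small for complex tasks.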

Syllabus

Beyond CoT: ReasonFLUX (Princeton Univ)

Taught by

Discover AI

