Neurosymbolic AI: Combining Large Language Models with Symbolic Methods
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Explore the intersection of neural networks and symbolic AI methods in this 59-minute lecture by Dr. Lara J. Martin from the Center for Language & Speech Processing at JHU. Delve into neurosymbolic approaches for story generation and understanding, with the ultimate goal of creating AI capable of playing Dungeons & Dragons. Learn about the limitations of large language models like ChatGPT and discover how combining neural networks with early AI symbolic methods can lead to more robust artificial intelligence. Gain insights into applications for improving accessible communication and understand how large language models can enhance such tools. Follow Dr. Martin's journey through various AI applications, including automated story generation, augmentative and alternative communication (AAC) tools, and AI for tabletop roleplaying games.
Syllabus
Introduction
What is GPT
Story with GPT
Storytelling
Generalized Sentences
Chain of Thought Prompting
What is Dungeons and Dragons
Challenges in Dungeons and Dragons
The Adventure Zone
RNNs vs. Large Language Models
What is AAC
Who uses AAC
Themes
Trans Text to Speech
Conclusion
Taught by
Center for Language & Speech Processing (CLSP), JHU