GraphGPT: Graph Instruction Tuning for Large Language Models - Session M2.2
Association for Computing Machinery (ACM) via YouTube
Overview
Explore graph instruction tuning for large language models in this 13-minute conference talk from SIGIR 2024. Delve into GraphGPT, presented by authors Jiabin Tang, Yuhao Yang, Wei Wei, Lei Shi, Lixin Su, Suqi Cheng, Dawei Yin, and Chao Huang. Learn how this method combines graph structures with language models to improve their ability to process and understand complex relational data. Gain insights into the potential applications and implications of this technology for fields including information retrieval, natural language processing, and artificial intelligence.
Syllabus
SIGIR 2024 M2.2 [fp] GraphGPT: Graph Instruction Tuning for Large Language Models
Taught by
Association for Computing Machinery (ACM)