Explore the future of AI beyond transformers in this conference talk introducing RWKV, a linear transformer architecture with significantly lower inference costs than standard transformers. Discover how this open-source model, developed under the Linux Foundation, offers a lightweight solution capable of running on CPUs and mobile devices. Learn about RWKV's architecture, scalability, and performance, as well as the World Tokenizer, designed to address limitations in non-English language processing. Gain insight into how RWKV aims to build a more efficient and inclusive AI model for global use, and why the World Tokenizer could benefit all AI models, not just RWKV.
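The "lower inference cost" claim comes from RWKV generating each token from a fixed-size recurrent state instead of attending over a growing key-value cache. The toy sketch below illustrates the idea with a single-channel, scalar version of an RWKV-style weighted key-value (WKV) recurrence; the real model operates on per-channel vectors with additional numerical-stability machinery, and the function and parameter names here (`w` decay, `u` bonus) are simplified assumptions, not the talk's actual implementation.

```python
import math

def wkv_step(a, b, k, v, w, u):
    """One recurrent step of a simplified RWKV-style WKV update.

    a, b -- running numerator/denominator state; their size is fixed,
            so per-token memory and compute stay O(1) regardless of
            sequence length (unlike a transformer KV cache).
    k, v -- "key" weight and "value" for the current token.
    w    -- time-decay applied to past contributions.
    u    -- bonus weighting for the current token.
    """
    out = (a + math.exp(u + k) * v) / (b + math.exp(u + k))
    # Decay the old state and fold in the current token.
    a = math.exp(-w) * a + math.exp(k) * v
    b = math.exp(-w) * b + math.exp(k)
    return out, a, b

# The state (a, b) never grows, no matter how many tokens we process.
a, b = 0.0, 0.0
outputs = []
for k, v in [(0.1, 1.0), (0.5, 2.0), (-0.2, 3.0)]:
    y, a, b = wkv_step(a, b, k, v, w=0.9, u=0.3)
    outputs.append(y)
```

Because every weight in the numerator and denominator is positive, each output is a convex combination of the values seen so far, which is what lets this recurrence mimic attention while keeping constant per-token cost.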