Explore a groundbreaking conference presentation that introduces the continuous tensor abstraction, a revolutionary approach that extends traditional tensor programming to allow indices to take real-number values rather than just integers. Learn how this innovative abstraction enables tensor operations like A[3.14] and continuous tensor algebra expressions such as C[x,y] = A[x,y] * B[x,y], where indices are defined over continuous domains.

Discover the implementation of piecewise-constant tensors that can process infinite domains in finite time, along with a new tensor format for efficient storage and automatic code generation techniques for kernel generation. Understand how this abstraction opens up new possibilities for expressing computational geometry and computer graphics problems in the language of tensor programming.

Examine performance results showing significant improvements over hand-optimized kernels in leading libraries, including 9.20× speedup on 2D radius search with 60× fewer lines of code, 1.22× improvement on genomic interval overlapping queries with 18× code reduction, and 1.69× speedup on trilinear interpolation in Neural Radiance Fields with 6× fewer lines of code. Gain insights into sparse tensor compilation and domain-specific language design from researchers at MIT, Georgia Tech, and NVIDIA presenting their work at the OOPSLA 2025 conference.
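To make the idea concrete, here is a minimal, hypothetical Python sketch (not the authors' implementation) of a 1-D piecewise-constant tensor: the index ranges over all reals, but the value only changes at finitely many breakpoints, so real-valued lookups like A[3.14] and pointwise algebra like C[x] = A[x] * B[x] take finite time. The class name and representation are illustrative assumptions.

```python
from bisect import bisect_right

class PiecewiseConstant:
    """A 1-D tensor indexed by real numbers, constant between breakpoints."""

    def __init__(self, breakpoints, values):
        # values[i] holds on [breakpoints[i], breakpoints[i+1]);
        # values[-1] holds from the last breakpoint onward,
        # and 0.0 is assumed before the first breakpoint.
        assert len(breakpoints) == len(values)
        self.breakpoints = list(breakpoints)
        self.values = list(values)

    def __getitem__(self, x):
        # Real-valued indexing, e.g. A[3.14]: binary-search for the
        # piece containing x.
        i = bisect_right(self.breakpoints, x) - 1
        return 0.0 if i < 0 else self.values[i]

    def __mul__(self, other):
        # Pointwise product C[x] = A[x] * B[x] over the infinite domain:
        # merge the two finite breakpoint lists and multiply the values
        # active on each resulting piece.
        pts = sorted(set(self.breakpoints) | set(other.breakpoints))
        return PiecewiseConstant(pts, [self[p] * other[p] for p in pts])

A = PiecewiseConstant([0.0, 2.0, 5.0], [1.0, 3.0, 0.5])
B = PiecewiseConstant([1.0, 4.0], [2.0, 4.0])
C = A * B
print(C[3.14])  # A[3.14] = 3.0 and B[3.14] = 2.0, so C[3.14] = 6.0
```

The key point the abstraction exploits is visible here: although A, B, and C are defined at uncountably many points, each is represented by a finite list of pieces, so computing the product touches only the merged breakpoints rather than the whole domain.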