TouchInsight: Uncertainty-aware Rapid Touch and Text Input for Mixed Reality from Egocentric Vision
Association for Computing Machinery (ACM) via YouTube
Overview
Learn about an innovative mixed reality input system in this 19-minute conference talk from UIST 2024, the 37th Annual ACM Symposium on User Interface Software and Technology, held in Pittsburgh. The talk explores how egocentric vision can be leveraged to enable uncertainty-aware, rapid touch and text input for mixed reality environments, combining touch interactions with mixed reality through egocentric visual processing.
Syllabus
TouchInsight: Uncertainty-aware Rapid Touch and Text Input for Mixed Reality from Egocentric Vision
Taught by
ACM SIGCHI