Lip-Interact: Improving Mobile Device Interaction with Silent Speech Commands
Association for Computing Machinery (ACM) via YouTube
Overview
Explore a novel interaction technique for smartphones in this 20-minute conference talk from the ACM User Interface Software and Technology Symposium. Discover Lip-Interact, a system that lets users control their devices through silent speech commands. Learn how the front camera captures mouth movements and uses deep learning to recognize 44 commands covering both system-level and application-level functions. Examine the results of three user experiments evaluating recognition accuracy, input efficiency relative to touch, and privacy compared with voiced commands. Gain insight into how this approach can enhance one-handed operation, improve interaction fluency, and provide efficient access to smartphone features in a variety of contexts.
Syllabus
Lip-Interact: Improving Mobile Device Interaction with Silent Speech Commands
Taught by
ACM SIGCHI