SurfaceSight

A New Spin on Touch, User, and Object Sensing for IoT Experiences

Abstract

IoT appliances are gaining consumer traction, from smart thermostats to smart speakers. These devices generally have limited user interfaces, most often small buttons and touchscreens, or rely on voice control. Further, these devices know little about their surroundings – unaware of objects, people, and activities happening around them. Consequently, interactions with these “smart” devices can be cumbersome and limited. We describe SurfaceSight, an approach that enriches IoT experiences with rich touch and object sensing, offering a complementary input channel and increased contextual awareness. For sensing, we incorporate LIDAR into the base of IoT devices, providing an expansive, ad hoc plane of sensing just above the surface on which devices rest. We can recognize and track a wide array of objects, including finger input and hand gestures. We can also track people and estimate which way they are facing. We evaluate the accuracy of these new capabilities and illustrate how they can be used to power novel and contextually-aware interactive experiences.

FULL CITATION

Gierad Laput and Chris Harrison. 2019. SurfaceSight: A New Spin on Touch, User, and Object Sensing for IoT Experiences. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). ACM, New York, NY, USA, Paper 329, 12 pages. DOI: https://doi.org/10.1145/3290605.3300559

MEDIA AND IMAGES