Consider smart home assistants. While useful, they generally interact only by responding to voice commands, and some add a small touchscreen. Normally, they have no awareness of their surroundings: they can't tell where you're standing, for instance, or whether there's an object on the table next to them.
SurfaceSight — a new project from Gierad Laput and Chris Harrison from Carnegie Mellon University’s Future Interfaces Group — takes a (literal) new spin on things, placing a rotating LiDAR unit underneath an Amazon Echo. This technology, best known for its use in autonomous vehicles, detects when something is in front of the sensor and can group readings into contiguous objects. With this data, it uses machine learning to classify the type of item sensed, and can even track hand movements and respond to gestures.
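The "grouping readings into contiguous objects" step can be pictured as a simple clustering pass over one rotation of the scan. Here's a minimal sketch, assuming the LiDAR reports (angle, distance) pairs; the `gap` threshold and the function name are our own illustrative choices, not part of SurfaceSight itself.

```python
import math

def cluster_scan(points, gap=0.10):
    """Group consecutive (angle_rad, distance_m) LiDAR readings into
    contiguous objects: a new cluster starts whenever the Euclidean
    gap between neighboring points exceeds the threshold (in meters)."""
    # Convert polar readings to Cartesian coordinates.
    xy = [(d * math.cos(a), d * math.sin(a)) for a, d in points]
    clusters = [[xy[0]]]
    for prev, cur in zip(xy, xy[1:]):
        if math.hypot(cur[0] - prev[0], cur[1] - prev[1]) > gap:
            clusters.append([cur])   # gap too large: start a new object
        else:
            clusters[-1].append(cur) # still part of the same object
    return clusters

# Two nearby readings and one far-away reading -> two "objects".
scan = [(0.00, 0.50), (0.02, 0.51), (1.50, 0.80)]
print(len(cluster_scan(scan)))  # 2
```

Each resulting cluster is a blob of points that could then be handed to a classifier, which is where the machine-learning step described above would take over.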
The video below shows off some truly impressive abilities, as the system can discriminate between different types of similarly sized objects. It can even roughly sense which direction a person is facing and modify the way it interacts in response. Possible applications for such tech could include a smart wall or desk surface, or perhaps functionality to tell you where you left your keys or sunglasses!