MIT's SCALE Brings Force-Based Interaction Smarts to Desks, Shelves, Floors, and More

Simple force sensors can add considerable interaction capabilities to everything from shop shelves to tables and floors, claim researchers.

Gareth Halfacree

Researchers at MIT's Tangible Media Group, working with Toppan Printing Co., have developed a framework designed to process load data from multiple sensors as a means of interacting through force: SCALE.

"SCALE provides a framework for load data from distributed load-sensitive modules for exploring force-based interaction," the researchers explain of the project. "Force conveys not only the force vector itself but also rich information about activities, including way of touching, object location and body motion. Our system captures these interactions on a single pipeline of load data processing."

"Furthermore, we have expanded the interaction area from a flat 2D surface to 3D volume by building a mathematical framework, which enables us to capture the vertical height of a touch point. These technical invention opens broad applications, including general shape capturing and motion recognition. We have packaged the framework into a physical prototyping kit, and conducted a workshop with product designers to evaluate our system in practical scenarios."

The team's focus on force sensing stems from how much information a load signal carries: intensity, direction, and the weight of objects, data that would be difficult to gather with other sensor types. From this captured force data, the team can reconstruct activities ranging from touch to object movement and body motion patterns, providing scope for what the team term "force-based interaction."
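As a rough illustration of the "single pipeline" idea the researchers describe, the Python sketch below normalizes every frame of per-module load readings into a total load plus a per-module distribution, the kind of single stream that touch, object, and motion stages could all consume. The module count, units, and noise threshold here are assumptions for illustration, not details from the paper.

```python
# A minimal sketch of a "single pipeline" for distributed load data.
# Each frame is one reading per load-sensitive module; downstream stages
# (touch, object tracking, motion) would consume this one stream. The
# module count, units, and noise floor are illustrative assumptions.
from typing import Iterable, Iterator, List, Tuple

def load_pipeline(
    frames: Iterable[List[float]], noise_floor: float = 0.05
) -> Iterator[Tuple[float, List[float]]]:
    """Yield (total_load, per_module_distribution) for each frame."""
    for frame in frames:
        total = sum(frame)
        if total < noise_floor:
            # Treat near-zero readings as "nothing on the surface."
            yield 0.0, [0.0] * len(frame)
        else:
            yield total, [f / total for f in frame]

# Example: three modules, with an object arriving mostly over module 1.
for total, dist in load_pipeline([[0.0, 0.0, 0.0], [0.1, 4.8, 0.1]]):
    print(f"total={total:.2f} N  distribution={[round(d, 2) for d in dist]}")
```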

The SCALE framework provides three key features: touch interaction, object status tracking, and motion pattern recognition. The first of these effectively expands a flat, two-dimensional interaction surface into a 3D volume, using a novel algorithm that can detect which part of an object is being held or what outline shape the object has.
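As background for how load cells localize a touch at all, the sketch below shows the classic 2D case: with a load cell under each corner of a rectangular plate, a single touch point falls at the load-weighted centroid (the center of pressure) of the four readings. The corner layout and plate size are assumptions for illustration; SCALE's mathematical framework for additionally recovering the vertical height of a touch is in the paper itself and isn't reproduced here.

```python
# Illustrative background, not the SCALE algorithm itself: the standard 2D
# "center of pressure" calculation for a rectangular plate with a load cell
# under each corner. Corner ordering and plate size are assumptions.

def touch_point(loads, size=(1.0, 1.0)):
    """Estimate (x, y) of a single touch from four corner loads.

    loads: readings at corners ordered (0,0), (w,0), (0,d), (w,d).
    size:  plate width and depth, in the same length units as the result.
    """
    width, depth = size
    corners = [(0.0, 0.0), (width, 0.0), (0.0, depth), (width, depth)]
    total = sum(loads)
    if total == 0:
        return None  # nothing touching the plate
    x = sum(f * cx for f, (cx, _) in zip(loads, corners)) / total
    y = sum(f * cy for f, (_, cy) in zip(loads, corners)) / total
    return x, y

# A press near the right-hand edge loads the two right corners most.
print(touch_point([0.2, 1.8, 0.2, 1.8]))  # -> (0.9, 0.5)
```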

Object status tracking, meanwhile, aims to track the position, weight, and five status conditions — pick, put, move, increase, and decrease — of objects as small as a pen and as large as a human. Finally, motion pattern recognition aims to capture everything from object movement on tables to human movement on the floor.
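A heavily simplified sketch of how those five statuses might be distinguished is given below, classifying consecutive weight-and-position samples for one tracked object. The thresholds and the idle fallback are invented for illustration and don't represent the team's actual classifier.

```python
# A simplified sketch of telling apart the five object statuses the paper
# names (pick, put, move, increase, decrease) from consecutive samples of
# one object's weight and estimated position. Thresholds are invented.

def classify(prev, curr, w_eps=0.05, p_eps=0.01):
    """prev/curr are (weight_kg, (x, y)) samples for one tracked object."""
    (w0, p0), (w1, p1) = prev, curr
    if w0 < w_eps and w1 >= w_eps:
        return "put"        # object appeared on the surface
    if w0 >= w_eps and w1 < w_eps:
        return "pick"       # object was lifted off
    moved = abs(p1[0] - p0[0]) > p_eps or abs(p1[1] - p0[1]) > p_eps
    if moved and abs(w1 - w0) <= w_eps:
        return "move"       # same weight, new position
    if w1 - w0 > w_eps:
        return "increase"   # e.g. something added to a container
    if w0 - w1 > w_eps:
        return "decrease"   # e.g. screws removed from a box
    return "idle"

print(classify((0.00, (0.0, 0.0)), (0.50, (0.3, 0.2))))  # put
print(classify((0.50, (0.3, 0.2)), (0.50, (0.6, 0.2))))  # move
```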

Following a workshop in which the framework was tested, the researchers came up with a variety of possible applications for the technology:

- A force-controlled volume slider built into a computer's monitor;
- A system for tracking how many screws have been removed from a drawer on a SCALE-enabled shelf, sketched below;
- The ability to capture the general shape of an object simply by touching its outer points;
- Retail automation, including selling by weight;
- A smart workspace capable of tracking the usage and position of tools, including a power drill, complete with reminders if tools aren't put away after use;
- Position estimation on a load-sensitive floor, which can also detect movement, including running.
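The screw-tracking idea, for one, reduces to simple arithmetic once a per-screw weight is known: a drop in the drawer's measured weight maps to a count of screws removed. The 4g per-screw figure below is an assumed value for the example, not a number from the project.

```python
# A hedged sketch of the workshop's screw-tracking idea: if the weight of
# one screw is known, a drop in a drawer's measured weight maps to a screw
# count. The 4 g figure is an assumption for this example only.

SCREW_WEIGHT_G = 4.0  # assumed weight of one screw, in grams

def screws_removed(weight_before_g: float, weight_after_g: float) -> int:
    """Infer how many screws left the drawer from its weight change."""
    delta = weight_before_g - weight_after_g
    return max(0, round(delta / SCREW_WEIGHT_G))

print(screws_removed(212.0, 200.1))  # -> 3 screws taken from the drawer
```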

The team's work was presented at the ACM Symposium on User Interface Software and Technology 2019 (UIST'19), and the resulting paper is now available to download in PDF format.
