Smart Glasses That Take Charge of Your Day

Unlike a smartphone or chatbot, these glasses continuously listen to their surroundings, proactively providing the wearer with helpful information and assistance.

nickbild
over 1 year ago Wearables
These smart glasses leverage contextual information to take proactive actions (📷: Cayden Pierce)

Smartphones provide us with some of the greatest conveniences that we have in our everyday lives. They assist us with everything from communication and navigation to managing our schedules and accessing a vast array of information and entertainment. Recent advances in artificial intelligence (AI) are being rapidly incorporated into our smartphones in an effort to provide us with more personalized experiences, smarter virtual assistants, and improved efficiency in our daily tasks.

But many people are still searching for ways to integrate these technologies more tightly into their lives. The present paradigm is limiting: we juggle many different apps that do not communicate with one another and that only know what we explicitly tell them. Furthermore, having to pull out the device and tap through a series of menus, or enter a long string of text on a tiny on-screen keyboard, makes smartphones impractical for real-time assistance.

Cayden Pierce of the MIT Media Lab believes there is a better way to incorporate modern technologies into our everyday experiences. Pierce is developing a pair of smart glasses that take over many of the functions of a smartphone and AI assistant, but they operate nothing like traditional solutions, which tend to feel as if a phone has been shoved into the frames of a pair of glasses. Instead, these glasses take on a proactive, contextually relevant role, performing useful functions and providing information without the wearer having to explicitly ask for it.

To provide contextually relevant assistance, the glasses capture audio and video to better understand what might be helpful to the wearer at any given time. This understanding is not limited to the present moment; the glasses can draw on data collected over a period of hours. Pierce noted, for example, that if the glasses learn that the user is shopping for a particular item, they could overlay reviews of that type of item while the user is out shopping. Crucially, they would not clutter the wearer's visual field with information about products they are not interested in.
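Pierce has not published implementation details, but the behavior described above can be sketched as a rolling context buffer plus a relevance filter: observations (e.g., transcribed speech) accumulate over a multi-hour window, and an overlay is surfaced only when a candidate topic actually appears in that recent context. The class names, window length, and matching logic below are all illustrative assumptions, not the project's actual design.

```python
from collections import deque
from dataclasses import dataclass
from time import time

# Hypothetical sketch (not Pierce's actual code): keep a rolling window of
# observations spanning hours, and surface overlays only for topics found
# in that window, so the display stays uncluttered otherwise.

CONTEXT_WINDOW_SECONDS = 6 * 60 * 60  # assumed window: roughly six hours


@dataclass
class Observation:
    timestamp: float
    text: str  # e.g., a transcribed utterance or an object label from video


class ContextBuffer:
    def __init__(self):
        self.observations = deque()

    def add(self, text, timestamp=None):
        self.observations.append(Observation(timestamp or time(), text))

    def prune(self, now=None):
        # Drop observations older than the context window.
        now = now or time()
        while self.observations and (
            now - self.observations[0].timestamp > CONTEXT_WINDOW_SECONDS
        ):
            self.observations.popleft()

    def relevant_overlay(self, candidate_topics, now=None):
        """Return the first candidate topic mentioned in recent context,
        or None, so nothing irrelevant clutters the visual field."""
        self.prune(now)
        recent_text = " ".join(o.text.lower() for o in self.observations)
        for topic in candidate_topics:
            if topic.lower() in recent_text:
                return topic
        return None


buffer = ContextBuffer()
buffer.add("I really need a new espresso machine", timestamp=time() - 3600)
topic = buffer.relevant_overlay(["espresso machine", "headphones"])
print(topic)  # -> espresso machine
```

A real system would replace the naive substring match with speech recognition and semantic retrieval, but the shape is the same: context accumulates passively, and output is gated on relevance to it.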

That example also hints at the proactive nature of the system. Unlike with smartphones or web-based AI chatbots, the wearer does not necessarily need to take any explicit action to get assistance. Pierce mentioned another situation that illustrates this capability: he was with a friend who was eating lots of dark chocolate late at night. After he mentioned to the friend that dark chocolate contains caffeine, and that eating it so late was therefore a bad idea, the glasses overlaid the typical caffeine content of dark chocolate on his visual field.

The glasses can similarly overlay all sorts of information that one might find helpful during a conversation. Pierce is also working toward enhancing the functionality to the point that the system could one day automatically handle tasks like checking into a hotel and calling a cab when one lands at an airport late at night.

Exactly where these glasses may go in the future is still uncertain, but in any case, it is well worth watching the video above to hear Pierce’s vision for the future of smart glasses.
