Apple’s Mood Lighting
Apple’s intelligent and expressive lamp, inspired by Pixar’s Luxo Jr., interacts naturally with humans, responding to gestures, voice commands, and more.
All of a sudden, with rapid advances in artificial intelligence (AI), computing, and robotics, it seems as if we are finally on the cusp of a future in which household robots can offer us meaningful help in our everyday lives. But that future might not (at least initially) look like what we have come to expect from science fiction. A humanoid robot that does everything from cooking and cleaning to laundry is still beyond what can practically be deployed in the typical home.
The next steps are much more likely to involve giving intelligence and personality to the appliances we already have in our homes. A team of engineers at Apple is working to make this a reality, and toward that goal they have developed a framework they call ELEGNT, designed to facilitate expressive and functional movement design for non-anthropomorphic robots. Their first implementation of ELEGNT is a small table lamp reminiscent of the one in Pixar’s 1986 animated short film Luxo Jr.
Instead of merely following instructions like a traditional smart device, the lamp interacts with users in a way that feels natural and intuitive. Equipped with a camera, microphone, and speaker, as well as tiny motors for movement, the lamp can see, hear, and respond to human gestures and voice commands. If a user gestures for it to move closer, for instance, it will respond by adjusting its position. If asked about the weather, it won’t just check online — it will first “look” out the window, as if demonstrating awareness of its environment.
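To make that interaction loop a bit more concrete, here is a minimal sketch in Swift of how sensed events might be mapped to movements and spoken replies. The event and response types, and the specific behaviors, are illustrative assumptions drawn from the examples above, not details of Apple’s actual software.

```swift
import Foundation

// Hypothetical events the lamp might perceive; the real sensor pipeline is not public.
enum PerceivedEvent {
    case gestureComeCloser
    case voiceQuery(String)
}

// Hypothetical response pairing a physical movement with an optional spoken reply.
struct LampResponse {
    let movement: String
    let speech: String?
}

// A minimal decision step: map a perceived event to an expressive response.
func respond(to event: PerceivedEvent) -> LampResponse {
    switch event {
    case .gestureComeCloser:
        return LampResponse(movement: "lean toward user", speech: nil)
    case .voiceQuery(let text) where text.lowercased().contains("weather"):
        // "Look" out the window before answering, as described above.
        return LampResponse(movement: "turn head toward window", speech: "Checking the forecast…")
    case .voiceQuery:
        return LampResponse(movement: "tilt head", speech: "Sorry, could you repeat that?")
    }
}

let reply = respond(to: .voiceQuery("What's the weather like?"))
print(reply.movement, reply.speech ?? "")
```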
This approach is based on the concept that movement and expression are key components of human communication. When people interact, they convey emotions and intentions through subtle body language. The researchers believe that incorporating similar expressive elements into robotic appliances can make interactions more natural and engaging. For example, the lamp may pull back slightly when “confused” by a request, or it might tilt its head in curiosity when given a command.
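One way to picture this balance between function and expression is as a scoring problem: each candidate motion is rated on how well it accomplishes the task and on how clearly it conveys intent or state. The Swift sketch below, with made-up scores and a simple weighted sum, is only an illustration of that general trade-off, not Apple’s method.

```swift
// A candidate motion scored along two axes: task effectiveness and expressiveness.
// The numeric scores and the weighting are illustrative, not Apple's.
struct CandidateMotion {
    let name: String
    let functionalUtility: Double  // how well the motion accomplishes the task, 0...1
    let expressiveUtility: Double  // how clearly it communicates intent or state, 0...1
}

// Pick the motion that best balances getting the job done with being expressive.
func selectMotion(from candidates: [CandidateMotion],
                  expressiveWeight: Double = 0.4) -> CandidateMotion? {
    candidates.max { lhs, rhs in
        let scoreL = (1 - expressiveWeight) * lhs.functionalUtility + expressiveWeight * lhs.expressiveUtility
        let scoreR = (1 - expressiveWeight) * rhs.functionalUtility + expressiveWeight * rhs.expressiveUtility
        return scoreL < scoreR
    }
}

let options = [
    CandidateMotion(name: "snap directly to target", functionalUtility: 1.0, expressiveUtility: 0.1),
    CandidateMotion(name: "pause, tilt head, then move", functionalUtility: 0.9, expressiveUtility: 0.8),
]
print(selectMotion(from: options)?.name ?? "none")
```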
Another interesting feature of the lamp is its ability to proactively assist users. It can automatically adjust its position to illuminate objects that a user is focusing on, without being explicitly asked. And if music is playing in the room, the lamp might even sway or “dance” along with the rhythm, emulating the playfulness of a pet bird bobbing to a tune.
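Proactive behavior like this can be thought of as a simple priority rule over whatever context the lamp can sense. The sketch below assumes hypothetical inputs (a detected focus point, a detected music tempo) and is meant only to illustrate that decision, not how the lamp actually works.

```swift
import Foundation

// Hypothetical context the lamp might track; the sensing details are assumptions.
struct RoomContext {
    let userFocusPoint: (x: Double, y: Double)?  // where the user is working, if detected
    let musicBeatsPerMinute: Double?             // tempo of detected music, if any
}

// Proactive behavior: aim the light at the user's focus, or sway to detected music.
func proactiveAction(for context: RoomContext) -> String {
    if let focus = context.userFocusPoint {
        return String(format: "aim beam at (%.1f, %.1f)", focus.x, focus.y)
    }
    if let bpm = context.musicBeatsPerMinute {
        return "sway once every \(60.0 / bpm) seconds"
    }
    return "hold current pose"
}

print(proactiveAction(for: RoomContext(userFocusPoint: (x: 0.3, y: 0.7), musicBeatsPerMinute: nil)))
```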
To evaluate their approach, the team conducted a user study in which 21 participants observed the lamp performing various tasks. The participants compared two versions of the lamp — one that moved in purely functional ways and another that incorporated expressive gestures. The results indicated that the expressive version was perceived as more intelligent, engaging, and relatable. Participants were more willing to interact with it and felt a stronger sense of connection to the device.
These findings suggest that integrating expressive movements into future AI-powered appliances could make interactions with technology feel more intuitive. The study also indicated that factors such as age and professional background influenced user perceptions, hinting at the possibility of customizing robotic behaviors to suit individual preferences.
While fully autonomous humanoid robots may still be years away from becoming household staples, Apple’s expressive lamp shows us that in the near future, home robotics might be more deeply integrated into our lives, and more emotionally engaging, than we ever expected.