"Previewed Reality" Uses AR, VR to Help Humans and Robots Coexist Peacefully and Collaboratively
Combining a range of sensors with augmented reality, Previewed Reality gives users the gift of foresight — and makes in-home robotics safer.
A team of roboticists at Japan's Kyushu University has developed a system for peaceful coexistence between humans and robots in indoor settings: Previewed Reality.
"In a co-existence environment of a human and a robot, unexpected collisions between the human and the robot must be avoided to the extent possible," the researchers explain in the abstract to their paper, brought to our attention by TechXplore. "In many cases, the robot is controlled carefully so as not to collide with a human. However, it is almost impossible to perfectly predict human behavior in advance. On the other hand, if a user can determine the motion of a robot in advance, he/she can avoid a hazardous situation and exist safely with the robot."
"In order to ensure that a user perceives future events naturally, we developed a near-future perception system named Previewed Reality. Previewed Reality consists of an informationally structured environment, a VR [virtual reality] display or an AR [augmented reality] display, and a dynamics simulator. A number of sensors are embedded in an informationally structured environment, and information such as the position of furniture, objects, humans, and robots, is sensed and stored structurally in a database. Therefore, we can forecast possible subsequent events using a robot motion planner and a dynamics simulator and can synthesize virtual images from the viewpoint of the user, which will actually occur in the near future."
To prove the concept, the team built a hardware and software implementation of what they refer to as an "informationally structured environment," or ISE. The software side is ROS-TMS, the Robot Operating System Town Management System; the hardware side is a Big Sensor Box, or B-Sen, which includes eighteen optical trackers, nine Kinect depth-sensing cameras, several laser rangefinders, and a number of radio-frequency identification (RFID) tag readers, all installed in the bedroom, dining room, and kitchen of a house. The house's occupants, meanwhile, were given either a Microsoft HoloLens or an Oculus Rift to act as an augmented or virtual reality window into the Previewed Reality system.
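As a rough illustration of what "sensed and stored structurally in a database" might look like, here is a minimal in-memory sketch. Every entity name, record field, and sensor label is an assumption made for illustration; the actual ROS-TMS database is considerably richer.

```python
# Minimal sketch of an "informationally structured environment" store:
# every tracked entity (furniture, object, human, robot) is recorded
# with its latest sensed position and the sensor that observed it.
import time

world_db: dict[str, dict] = {}

def update_entity(name: str, kind: str, position: tuple[float, float, float],
                  source: str) -> None:
    """Record the latest observation of an entity from one sensor feed."""
    world_db[name] = {"kind": kind, "position": position,
                      "source": source, "stamp": time.time()}

# Hypothetical observations, standing in for the optical trackers, Kinect
# cameras, laser rangefinders, and RFID readers installed in B-Sen.
update_entity("dining_table", "furniture", (2.0, 1.5, 0.0), "optical_tracker")
update_entity("resident_1", "human", (1.2, 0.8, 0.0), "kinect_3")
update_entity("service_robot", "robot", (3.0, 2.0, 0.0), "laser_rangefinder")

print(world_db["resident_1"]["position"])  # latest sensed human position
```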
As the user interacts with the robots, they are kept informed not only of the robots' current location but also of their future location, as well as the predicted outcome of any task the robots are carrying out, such as delivering a snack. The PhysX physics engine provides these predictions, down to forecasting that a snack held in mid-air will fall if dropped, giving the user a preview of the action before it occurs and letting them position a hand ready to catch it. Collision detection is also implemented: the robot's predicted future position is rendered red if a collision risk is detected.
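That red-tint warning can be captured in a few lines: compare each predicted pose against the user's tracked position and switch the overlay color when anything comes too close. The distance threshold and function names below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the collision-risk cue the article describes: tint the
# robot's AR "ghost" red if any predicted pose comes too near the user.
import math

SAFE_DISTANCE_M = 0.6  # assumed clearance threshold, not from the paper

def ghost_color(predicted_xy: list[tuple[float, float]],
                user_xy: tuple[float, float]) -> str:
    """Return the overlay color for the robot's previewed trajectory."""
    for x, y in predicted_xy:
        if math.hypot(x - user_xy[0], y - user_xy[1]) < SAFE_DISTANCE_M:
            return "red"    # collision risk within the preview horizon
    return "green"          # planned motion stays clear of the user

# Usage: a planned path sweeping within 0.3 m of the user triggers the cue.
path = [(0.0, 0.0), (0.5, 0.1), (1.0, 0.2)]
print(ghost_color(path, user_xy=(1.0, 0.5)))  # -> "red"
```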
The team found that the system worked well, though the HoloLens variant, dubbed Previewed Reality 2.0, worked better than the Oculus Rift implementation. "In the future," the researchers conclude, "we intend to build a simpler and easy-to-use system for Previewed Reality. Although the current system uses a goggle-type immersive VR display or an AR display, we are developing a new system using a smartphone, which is low cost, lighter, and easy to use. We believe that these systems will make the use of the proposed Previewed Reality more realistic in daily life in the near future."
The full paper is available under open access terms in the journal Advanced Robotics.
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.