Today’s virtual reality devices rely almost exclusively on visual and audio feedback when users interact with virtual objects. The only real haptic feedback currently on offer is vibration—the same groundbreaking tech found on your ’90s Nokia brick. But, while everyone in the industry agrees that better haptic feedback is going to be key to the success of virtual reality, how best to provide that feedback is very much up for debate.
As one of the world’s largest tech companies, it’s no surprise that Microsoft has been working on the problem. What is a little surprising is that they’ve developed four new prototypes with the help of interns from some of the country’s best universities. Each design approaches the challenge in a different way, and seems to be intended for specific kinds of virtual reality applications.
First up on the list is the CLAW—which may or may not be an acronym. Of the controllers that Microsoft is demonstrating, this is probably the most similar to a traditional VR controller. It was created by Microsoft and an intern from Stanford University named Inrak Choy.
The CLAW resembles a joystick, but with an index finger-operated trigger that is mounted on a servo-actuated arm. It’s designed to be used with either a pinching grip or a touching grip. In both cases, the trigger uses those servos to provide feedback in two ways.
The first method is to move the trigger against your finger as you touch a virtual surface. For example, if you quickly run your finger across some digital sandpaper, it vibrates harshly to simulate the sensation you’d expect. The other way it provides feedback is by resisting the pressure you put on it. That resistance is variable, so grabbing a sponge feels distinctly different from grabbing a brick.
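The variable-resistance idea can be sketched as a simple mapping from a virtual material’s stiffness to a servo torque command. The material names, stiffness values, and torque model below are illustrative assumptions, not details from Microsoft’s actual firmware:

```python
# Illustrative sketch of variable grip resistance, CLAW-style.
# Stiffness values and the linear torque model are assumptions,
# not taken from Microsoft's implementation.

# Normalized stiffness per virtual material (0 = soft, 1 = rigid).
MATERIAL_STIFFNESS = {
    "sponge": 0.1,
    "rubber": 0.5,
    "brick": 1.0,
}

def servo_torque(material: str, squeeze: float, max_torque: float = 2.0) -> float:
    """Return a resisting torque (N*m) for a given squeeze amount (0..1).

    A stiffer material pushes back harder for the same squeeze, so a
    brick feels rigid while a sponge gives way easily.
    """
    stiffness = MATERIAL_STIFFNESS[material]
    return max_torque * stiffness * squeeze

# A half squeeze on a sponge resists far less than on a brick.
soft = servo_torque("sponge", 0.5)
hard = servo_torque("brick", 0.5)
```

A real controller would run something like this in a tight loop, reading the trigger position and updating the servo command each cycle.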
This next one is probably the most unconventional of the group. The Haptic Wheel was made with the help of intern Eric Whitmire from the University of Washington, and as its name suggests, it’s built around a wheel. That wheel is mounted on a handle and has two degrees of movement: rotation around the wheel’s axis, and translation perpendicular to that axis, toward the user’s index finger.
Both of those movements are motorized, and are used to simulate the feel of various surfaces on the user’s fingertip. When they touch a virtual object, the wheel moves out to touch their finger. The wheel can have multiple textured surfaces (and wheels can be interchanged), so a specific texture can be moved against the user’s fingertip. Then, as they move their finger in the virtual environment, the wheel rotates as necessary to emulate the feel of the appropriate texture.
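The rotation half of that control loop amounts to matching the wheel’s surface speed to the finger’s speed in the virtual scene. A minimal sketch, where the wheel radius, scale factor, and function name are all assumptions for illustration:

```python
# Sketch of the Haptic Wheel rotation idea: when the fingertip drags
# across a virtual texture, spin the wheel so its rim speed under the
# finger matches the finger's speed in the scene.
# The 2 cm radius and texture_scale parameter are illustrative.

WHEEL_RADIUS_M = 0.02  # assumed wheel radius in meters

def wheel_angular_velocity(finger_speed_mps: float, texture_scale: float = 1.0) -> float:
    """Angular velocity (rad/s) so the rim moves at the finger's speed.

    texture_scale lets a coarse texture exaggerate or damp the motion.
    """
    return texture_scale * finger_speed_mps / WHEEL_RADIUS_M

# Dragging at 5 cm/s over a 1:1 texture spins the wheel at 2.5 rad/s.
omega = wheel_angular_velocity(0.05)
```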
Both the CLAW and the Haptic Wheel are intended to be used with a single hand, but intern Evan Strasnick—also from Stanford—came up with a two-handed setup called Haptic Links. This is actually three different, closely related prototypes, all of which rely on altering the degrees of freedom between two handles. By limiting that freedom in specific joints, it can replicate the feel of handling particular digital objects.
The first of those, called Chain, connects the two handles by a series of ball joints. A cable runs through those joints that can be tensioned on demand in order to increase the friction of the joints. That provides the ability to freeze the handles in place (in relation to each other), which could make it feel as though the user is touching a rigid object.
The second design is Layer-Hinge, which integrates a friction brake on arms between the two handles and actuated set-screws in ball joints attached to the handles themselves. That allows selective locking of each joint. Taking that one step further is the Ratchet-Hinge prototype, which replaces the friction brake of the Layer-Hinge with a braking ratchet. The ratchet can be locked in either direction (or both), which allows for even more granular restriction of movement.
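Across all three prototypes, the common idea is mapping a virtual object to a lock state for each joint. A toy sketch of that mapping, where the object names, joint names, and lock states are hypothetical examples rather than Microsoft’s actual scheme:

```python
# Sketch of the Haptic Links concept: each virtual object is mapped to
# a constraint for every joint between the two handles. Object and
# joint names here are invented examples, not from the prototypes.

from enum import Enum

class Lock(Enum):
    FREE = "free"          # joint moves normally
    LOCKED = "locked"      # joint frozen (cable tensioned / brake applied)
    RATCHET = "ratchet"    # Ratchet-Hinge style: motion allowed one way only

# Per-object joint constraints for a hypothetical two-joint linkage.
OBJECT_CONSTRAINTS = {
    "rigid_tool": {"hinge": Lock.LOCKED, "swivel": Lock.LOCKED},
    "spring_grip": {"hinge": Lock.RATCHET, "swivel": Lock.LOCKED},
    "free": {"hinge": Lock.FREE, "swivel": Lock.FREE},
}

def apply_constraints(obj: str) -> dict:
    """Return the lock state to command per joint when the user grabs obj."""
    return OBJECT_CONSTRAINTS.get(obj, OBJECT_CONSTRAINTS["free"])
```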
The last controller—Canetroller—stands out because it has been created specifically with blind users in mind. This one was made with the help of interns Yuhang Zhao from Cornell University and Cindy Bennett from the University of Washington. It brings the white cane into the virtual world, which gives many blind users the ability to navigate in a way that they’re already accustomed to.
The Canetroller itself is a shorter-than-usual cane that has been fitted with a brake, vibration capability, and 3D auditory feedback. The user can sweep the floor and other surfaces, and tap walls and objects. If the virtual cane makes contact with a virtual object, it simultaneously activates all three feedback mechanisms.
If the object is solid, the brake (which is connected to a harness worn on the waist) is actuated to stop the cane from moving further. The cane is also vibrated in a way that is appropriate for the surface or object it’s contacting. Finally, a suitable sound is played from the correct point spatially. Testing showed that blind users were able to quickly and accurately navigate a virtual room in much the same way they would in the real world.
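That three-channel response can be sketched as a lookup from surface type to a feedback profile fired on contact. The surface names, frequencies, and sound labels below are assumptions for illustration:

```python
# Sketch of the Canetroller's contact response: on a virtual collision,
# command the brake, a surface-appropriate vibration, and a spatialized
# sound together. All profile values here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Feedback:
    brake: bool          # engage the waist-harness brake (solid objects)
    vibration_hz: float  # vibration pattern matched to the surface
    sound: str           # audio cue, played spatially at the contact point

SURFACE_PROFILES = {
    "wall":   Feedback(brake=True,  vibration_hz=80.0, sound="tap_wall"),
    "carpet": Feedback(brake=False, vibration_hz=20.0, sound="sweep_carpet"),
    "tile":   Feedback(brake=False, vibration_hz=60.0, sound="sweep_tile"),
}

def on_contact(surface: str, position_xyz: tuple) -> Feedback:
    """Look up the response for a contact; all channels fire at once.

    position_xyz is where a real system would place the 3D sound;
    this sketch just returns the profile to be commanded.
    """
    return SURFACE_PROFILES[surface]
```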