Traditionally, the only way to learn to play the piano was to buy a keyboard and go through the grueling process of learning to read sheet music, memorizing which note corresponds to which key, and building the muscle memory for chords. The goal of this project was to create a piano learning experience that softens those pitfalls. We have built a VR piano with a haptic glove add-on for this learning experience. A virtual piano is presented in VR with notes/chords marked on the keyboard. The haptic glove aids learning by moving the fingers up and down according to note sequences, teaching muscle memory for specific chords/songs. This program is aimed at beginners who want to break into learning piano but have limited resources for a real instrument and lessons, or who want to try it out before deciding to commit. The system has the potential to be more accessible than a traditional piano, as it does not require as much space.
Demo
Milestone 1
At the conception of our project, we were unsure of where to take our ideas. As a group we were set on using VR or AR technology in combination with haptics to create an immersive experience, but were unsure how to apply those technologies in a meaningful way. We wanted to experiment with VR and haptics as applied to music and instruments, and came up with a few ideas.
Our first idea was a haptic backpack containing a sludge-like material, with the sludge moving around the body in accordance with the music's tempo. We were concerned about defining movements based on tempo, as well as the haptic feedback being imprecise due to its placement on the back.
The second idea was an abstract VR rhythm game that provides haptic feedback to the user based on the song being played. We wanted to make haptic shoulder pads, wristbands, and elbow pads to give a total sense of haptic feedback across the body. We had similar concerns as with the previous idea: the feedback might not be accurate or palpable enough to feel immersive.
We settled on the idea of a VR piano tutor, capable of presenting the user with a VR piano to play and giving feedback based on when they pressed a key. It would use a glove to administer feedback and guide the user’s hands to play the correct notes/chords with the correct fingers. A small feedback device would be placed on the fingertips to indicate when a key is pressed, with a motor hooked up to the glove to position the fingers vertically.
It was after receiving feedback on our ideas that we considered (and eventually applied) the idea of using electrical muscle stimulation (EMS) to guide the fingers.
Early work
The gloves are meant to hold the fingers back when the EMS is activated.
Steps to restrain fingers:
- A servo motor is used to rotate a piece of string tied around the servo horn
- The string is connected to a ring that can be fastened around the user’s finger
- This restricts the user’s range of motion, so that they can’t bring the targeted finger all the way down
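The restraint logic above can be sketched as follows. This is a Python stand-in for the firmware behavior, not the actual Arduino code; the angle values and finger names are illustrative, not the real calibration:

```python
# Model of the servo restraint: each finger's servo winds a string tied to a
# ring on that finger, pulling it back so it cannot press a key.
REST_ANGLE = 0      # string slack, finger free to move (illustrative value)
PULLED_ANGLE = 90   # string wound, finger held back (illustrative value)

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def restraint_angles(allowed_fingers):
    """Return a servo angle per finger: only fingers in the chord stay free."""
    return {f: (REST_ANGLE if f in allowed_fingers else PULLED_ANGLE)
            for f in FINGERS}

# For a chord played with thumb, middle, and pinky, the other two are held back.
angles = restraint_angles({"thumb", "middle", "pinky"})
```

The key design point is that the servos only *restrict* range of motion; they never force a finger down, which is what the EMS is for.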
At this stage, we had the VR environment set up and had enabled hand tracking on the Meta Quest 2 with finger-level precision to interact with objects in the virtual space. We also imported a pre-made piano prefab to use as the base keyboard. We edited the prefab and added sound so that it has individual pressable keys that can interact with the virtual hands/fingers via the headset's built-in hand tracking.
For the wearable device prototype, we researched and tested where to place the EMS pads on the hand and arm to target specific fingers and nerves. We also tried using the EMS to rotate/move the entire arm for an additional sensory indication of where a user's hands should be. At this stage we had both the headset and the wearable prototype working individually to some extent, but not yet communicating with each other.
Our goal was to make a wearable device capable of moving individual fingers up and down in accordance with the VR scene, in order to teach muscle memory for different chords and notes. To do this we used servo motors and an electrical muscle stimulation (EMS) device.
We used five servos, one for each finger on a hand. Each servo motor was attached to a fishing line tied to a 3D-printed ring on each finger. This allowed the servo motors to move each individual finger from a resting position to a pulled-back/lifted position. We used this to stop certain fingers from going down and pressing keys when they were not supposed to.
We used the EMS device to bring specific fingers down for each chord, to help the user know which fingers are supposed to be used and when. To do this, we took an off-the-shelf TENS unit and modified it to work with our ESP32. We used four 3.3 V relays to activate and deactivate switches on the EMS device, which let us control how strong an electrical signal the EMS device sent.
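One way to picture the relay control is as the ESP32 "pressing" the TENS unit's own intensity switches. This Python sketch models that idea; the up/down-stepping behavior and relay names are assumptions about how the modified unit is driven, not a documented part of our hardware:

```python
# Model of driving a TENS unit's front-panel switches through relays: pulsing
# an "up" or "down" relay steps the output intensity, just as a user pressing
# the physical buttons would. Relay names are illustrative.
class EMSController:
    def __init__(self):
        self.level = 0    # assume the unit powers on at intensity 0
        self.pulses = []  # record of relay activations, for inspection

    def _pulse(self, relay):
        self.pulses.append(relay)  # stand-in for toggling a relay's GPIO pin

    def set_level(self, target):
        """Pulse the relays until the unit reaches the target intensity."""
        while self.level < target:
            self._pulse("intensity_up")
            self.level += 1
        while self.level > target:
            self._pulse("intensity_down")
            self.level -= 1

ems = EMSController()
ems.set_level(1)   # bring fingers down at the lowest intensity
ems.set_level(0)   # the stop command returns the EMS to level 0
```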
Using the servo motors and the EMS together, we were able to control a person's fingers and hand as needed to teach them the muscle memory for songs and chords.
We then used the VR headset to control the wearable device according to the scene in view. The VR headset and the ESP32 were able to talk to each other over a wireless peer-to-peer network, allowing us to send instructions from the VR headset to the ESP32 and effectively control the servos and the EMS. We used Wi-Fi for this network since the Unity application has to be built to, and run standalone on, the headset.
We set up a Wi-Fi hotspot and connected the VR headset to it. The ESP32 would connect to the same hotspot and then start a WebSocket server on port 81. We used the Arduino WebSockets library (Arduino WebSockets) to create and host the server. Since the VR headset is on the same Wi-Fi connection as the ESP32, it could connect to the WebSocket directly using the ESP32's IP address and port. We used the NativeWebSocket library (Native WebSocket) for Unity to connect to the WebSocket and send messages. From there, the VR headset could send strings of text to the ESP32 over the WebSocket.
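The connection details reduce to building a `ws://` URL from the ESP32's address and exchanging short UTF-8 text frames. A minimal Python sketch of that addressing and framing (the IP address is illustrative, and the exact message strings are whatever the Unity script sends):

```python
# The headset connects to the ESP32's WebSocket server by IP address and port.
ESP32_IP = "192.168.0.50"   # illustrative address assigned on the hotspot
PORT = 81                   # the port the ESP32's server listens on

def server_url(ip, port):
    """Build the URL the Unity client would be given to connect."""
    return f"ws://{ip}:{port}"

def encode(message):
    """Payload of a WebSocket text frame: plain UTF-8 bytes."""
    return message.encode("utf-8")

def decode(payload):
    return payload.decode("utf-8")

url = server_url(ESP32_IP, PORT)
```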
The VR scene is completed by a "Practice" mode, which allows users to see the note sequence that needs to be played. A falling-key visualization shows the user which note to press, when, and for how long. During practice, piano keys turn green if the user hits the right key and red if they hit the wrong key.
The VR headset decides what to send based on the "note sequence". In Unity, specific keys on the piano are assigned as the sequence to be played, along with the length of each note. The note sequence script reads these notes as the "expected note name". When the sequence starts, it gets the name of the current expected note, sends the name to the ESP32, and then waits for the specified amount of time before continuing to the next note. After every note in the sequence has been sent, a stop command is sent to the ESP32 and the scene returns to the "start scene".
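That send-and-wait loop can be summarized in a few lines. This is a Python stand-in for the Unity C# coroutine; the note names, durations, and the "STOP" token are illustrative, not the literal strings our scripts use:

```python
import time

def play_sequence(sequence, send, wait=time.sleep):
    """Send each expected note to the ESP32, hold for its duration, then send
    a stop command, mirroring the note sequence script's behavior."""
    for note, duration in sequence:
        send(note)       # tell the glove which note (finger) is next
        wait(duration)   # hold until the note's length has elapsed
    send("STOP")         # reset the glove; scene returns to the start scene

# Collect the outgoing messages instead of sending them, to show the order.
sent = []
play_sequence([("C4", 0.5), ("E4", 0.5), ("G4", 1.0)],
              send=sent.append, wait=lambda s: None)
```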
On the Arduino side, the code stores the most recently sent value in a global string. When the loop detects that a string representing a note has arrived, it activates the play function. The notes from "Practice" mode are mapped to the finger that should be used to play each note. The play function activates the EMS at intensity level 1 to bring the player's finger down, and the servos pull back the fingers that are not supposed to play. After the function is called, it resets the global string to empty so that the loop does not activate a finger more times than intended. When the code receives the stop command, it resets all of the servos and sets the EMS back to level 0.
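A condensed model of that loop is below. Again this is a Python stand-in for the Arduino sketch, with an illustrative note-to-finger mapping and an assumed "STOP" token:

```python
# Model of the ESP32 loop: a global string holds the last received message;
# when a note arrives, the EMS comes on and the non-playing fingers' servos
# pull back; the stop command resets everything.
NOTE_TO_FINGER = {"C4": "thumb", "E4": "middle", "G4": "pinky"}  # illustrative
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

state = {"incoming": "", "ems_level": 0, "pulled_back": set()}

def on_message(msg):
    """WebSocket callback: store the newest message for the loop to consume."""
    state["incoming"] = msg

def loop_once():
    msg = state["incoming"]
    if not msg:
        return
    if msg == "STOP":
        state["ems_level"] = 0          # EMS back to level 0
        state["pulled_back"] = set()    # release all servos
    elif msg in NOTE_TO_FINGER:
        finger = NOTE_TO_FINGER[msg]
        state["ems_level"] = 1          # EMS at intensity level 1
        state["pulled_back"] = {f for f in FINGERS if f != finger}
    state["incoming"] = ""              # consume, so a note fires only once

on_message("E4")
loop_once()
```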
Related Products / Inspiration
DextrEMS: Increasing Dexterity in Electrical Muscle Stimulation by Combining It with Brakes:
https://dl.acm.org/doi/10.1145/3472749.3474759
Electrode placement on the forearm for selective stimulation of finger extension/flexion:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5764314/
Unity Piano Prefab (Unity Asset Store):
https://assetstore.unity.com/packages/3d/props/15-low-poly-models-202061
Arduino WebSockets (the Arduino library we used to create the server that the VR headset and the ESP32 communicate over):
https://github.com/Links2004/arduinoWebSockets/tree/master
Native WebSocket:
https://github.com/endel/NativeWebSocket
Future Goals
- Several pieces of sheet music will be embedded into the program, and users can choose which sheet to practice and at which tempo (slow/fast).
- Haptic feedback on the fingertips when a key is pressed