Playing music well is hard — plain and simple. It’s not just the dexterity and familiarity with your instrument that makes it so; that can be learned with enough practice and muscle memory. What’s truly difficult about being a musician is understanding musical structure and how it’s used to make music sound pleasing to the ear. Some of this can be picked up naturally by those with a knack for it (think garage bands), but some of it is a formal learning process (think Baroque).
Zack discovered this when he set out to build an AI companion for playing piano duets. His quest was inspired by an NPR piece on Dan Tepfer’s Player Piano. But, Zack wanted to take a different approach: he wanted his AI to learn his style of playing, and then recreate it in real-time. Well-practiced musicians can do this on the fly; it’s the whole idea behind improvisational jamming. But, how does one teach a computer to do it?
Following along with Zack’s programming log, which reads delightfully like a journal, lends some great insights into the process. You can see Zack’s frustrations and breakthroughs as he works on the problem. His videos recorded after each major change show how the AI piano player improved along the way (and sometimes got worse) as he experimented with different algorithms.
The whole project was originally programmed in Python, but Zack ended up switching to Go on a Raspberry Pi for faster real-time processing. The programming is still in progress, but you can read through his journal and see how far he’s come. The biggest breakthrough was Zack’s realization that his own piano playing was based on relatively simple rules and structures.
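To get a feel for what "simple rules" might look like in practice, here’s a minimal sketch in Go (the language the project moved to). This is not Zack’s actual code — it’s a hypothetical illustration of one such rule: answering each incoming MIDI note with a diatonic third above it, staying inside C major. The `harmonize` function and the hard-coded scale are assumptions for the example.

```go
package main

import "fmt"

// cMajor marks which pitch classes (0–11) belong to the C major scale.
var cMajor = map[int]bool{0: true, 2: true, 4: true, 5: true, 7: true, 9: true, 11: true}

// harmonize answers an incoming MIDI note with a diatonic third above it.
// It tries a major third first, and falls back to a minor third when the
// raw interval would land outside the scale.
func harmonize(note int) int {
	third := note + 4 // start from a major third (4 semitones)
	if !cMajor[third%12] {
		third-- // shrink to a minor third to stay in key
	}
	return third
}

func main() {
	// C4, D4, E4 — the companion replies with E4, F4, G4.
	for _, n := range []int{60, 62, 64} {
		fmt.Println(n, "->", harmonize(n))
	}
}
```

A real-time version would react to note-on events from a MIDI stream instead of a fixed slice, but the core idea — a small lookup plus a couple of interval rules — is the kind of structure the journal describes discovering.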
Teaching the AI to use these rules and improvise based on what he was playing yielded impressive results. While the project is still in progress, Zack is keeping the GitHub code updated, so you can experiment with it yourself. Even better, see if you can improve it with your own ideas!