I’m a musician and a creative technologist with Google’s Pie Shop, an experience design studio tasked with translating the complex concepts behind emerging technologies at Google into tangible exhibits. For the last year or so I’ve been thinking about and designing tools that help musicians make use of Magenta’s musical models.

One of the projects I worked on is Latent Loops – an interface that turns Magenta’s MusicVAE into a playable electronic instrument.

The project began as a browser-based tool, but this summer the Pie Shop team and I also turned it into an interactive installation in the form of a latent space of melodies that you can walk on.

As a musician – someone who spent a lot of time studying and attempting to master music theory – I was initially very skeptical about applying machine learning to music. However, as a technologist and composer who uses computers as part of my music making, I saw pretty quickly how artistically interesting the idea of a musical palette could be.

ML as an Educated Collaborator

Making good music is hard.

We’ve come up with rules to describe what we think makes music good – music theory (of varying traditions). We can describe and dissect great pieces of music in terms of these rules. We can come up with millions of musical ideas that adhere to them exactly, but they won’t all be that great.

Most people (even a lot of well-respected musicians) don’t know many of these rules, let alone how to read sheet music, but almost anyone can recognize a great melody when they hear it. I’ll talk from here on out specifically about melody, as this is what Latent Loops is focused on, but I believe this applies to other aspects of music as well.

As composer Paul Hindemith wrote in The Craft of Musical Composition, “In no field are taste, musical culture, and genuine inclination or the lack of it more important than in melody.”

This is not to say that anyone can come up with a brilliant melody regardless of musical training. The point, I believe, is that taste and intuition are the guiding light that musical training, technology, and everything else serve to support.

Latent Loops uses MusicVAE’s melody model, which is trained on millions of melodies. It has derived its own understanding of music theory from these examples, which means it’s good at generating ideas that roughly follow these rules. It’s important to clarify that the model was underfit during training, which makes it possible to generate outside of the rules, but what I’m getting at is that the model has had a musical education and knows something about how music works. It can come up with lots of options a musician can pick through, tweak, and use as they see fit. It can also help new musicians refine their ideas so they make more musical sense. The ML model acts as an educated collaborator – a bandmate you can riff and bounce ideas around with.
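
To make that concrete, here’s a minimal sketch of what asking the model for ideas looks like in the browser, assuming the @magenta/music JavaScript library and one of its hosted 2-bar melody checkpoints. The checkpoint URL, function name, and parameter values are illustrative, not the exact ones Latent Loops uses.

```ts
import * as mm from '@magenta/music';

// Hosted 2-bar melody checkpoint; this URL is an assumption for illustration.
const CHECKPOINT =
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/mel_2bar_small';

// Ask MusicVAE for a handful of melody ideas to pick through, tweak, and use.
async function suggestMelodies(count = 4, temperature = 1.0): Promise<mm.INoteSequence[]> {
  const mvae = new mm.MusicVAE(CHECKPOINT);
  await mvae.initialize();                         // download and set up the model
  const ideas = await mvae.sample(count, temperature);
  mvae.dispose();                                  // free the model's tensors
  return ideas;
}
```

Raising the sampling temperature nudges the model toward stranger suggestions; lowering it keeps the ideas closer to what its training data considers safe.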

This mirrors what I think is so fun about making electronic music. Regardless of whether you’re using hardware or a DAW, a big part of your time is spent twisting knobs, flipping through different sounds, samples, and settings, picking out things that sound cool, then tweaking them to serve your purposes. Here, we’re manipulating the actual notes in a musical score, but doing it the same way we might work with a sample, synth patch, or effect.

Techniques for Composing with Latent Loops

A good melody has a balance of contrasting elements. A balance of the expected and the unexpected. Repetition and variation. Tension and release. We like hearing something, recognizing it when we hear it again, then being surprised when it changes.

We like melodies that use sound and silence, so the rests are just as important as the notes. We like a balance of repeated notes, scale-wise motion, and leaps. We like a little bit of dissonance with our consonance, and a little bit of syncopation (where the beat is slipped just before or after we expect it) with our regular rhythms.

A good composer takes all these things into account (consciously or subconsciously) when they sculpt a perfect melody.

A tool like Latent Loops is interesting because it lets you think about composition differently. It allows you to follow your intuition and taste. Rather than composing a single balanced melody, we can construct a palette that contains these musical poles in varying proportions, and see what happens in the blended space between them.

For example, in the following video I create a palette with a bit of scale-wise motion by sketching a scale in the top left corner. Then I add some leaps in the top right. I introduce syncopation with a sparse melody that emphasizes offbeats in the lower left. And I add regularity with a repeating pattern in the lower right. The interface sends MIDI data to my DAW, so the synth sound you hear is coming from there.
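
Under the hood, a palette like this can be filled in by blending between the four corner seeds in MusicVAE’s latent space. Here’s a rough sketch, assuming the @magenta/music library’s interpolate call with four corner NoteSequences; the grid size, function name, and corner ordering are illustrative rather than the tool’s actual implementation.

```ts
import * as mm from '@magenta/music';

const GRID_SIZE = 8; // cells per side of the palette; an assumption for illustration

// Fill the palette by interpolating between the four corner seed melodies in
// latent space; the result is GRID_SIZE * GRID_SIZE sequences, one per cell.
async function buildPalette(
  mvae: mm.MusicVAE,            // an already-initialized MusicVAE instance
  corners: mm.INoteSequence[]   // the four corner seeds sketched by the user
): Promise<mm.INoteSequence[]> {
  return mvae.interpolate(corners, GRID_SIZE);
}
```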

This matrix is tuned to a pentatonic major scale by default (you can choose to use different scales in the settings panel), which I like because it allows you to think gesturally. Even if you squiggle randomly, the notes will all sound relatively harmonious together.
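
As a rough illustration of that tuning, a grid row can simply be snapped to a scale degree so that any gesture lands on a harmonious pitch. The root note and row-to-pitch mapping below are assumptions, not the tool’s actual code.

```ts
const PENTATONIC_MAJOR = [0, 2, 4, 7, 9]; // semitone offsets above the root

// Map a grid row index to a MIDI pitch on the pentatonic major scale,
// climbing an octave every five rows.
function rowToPitch(row: number, rootMidi = 60 /* middle C */): number {
  const octave = Math.floor(row / PENTATONIC_MAJOR.length);
  const degree = PENTATONIC_MAJOR[row % PENTATONIC_MAJOR.length];
  return rootMidi + 12 * octave + degree;
}
```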

As I listen around the palette, some of these cells aren’t great. Some are interesting, and I drag those over to the clipboard. Some are almost interesting but have some element that I don’t like, so I double-click into the editor and tweak them until I do. This is a tool for musicians; there’s nothing precious about the melody suggestions from MusicVAE. They’re just a starting point.

Another way to use the palette is with MIDI clock sync, which lets you play it as a live instrument in sync with tracks you create in your DAW. In this video I’m playing the lead synth by exploring the pentatonic minor palette I created in my browser, while triggering drum loops composed in Ableton using an MPC.
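
For anyone curious how a browser talks to a DAW at all, here’s a bare-bones Web MIDI sketch that sends a single note to the first available output port. Real clock sync would also involve listening for the DAW’s MIDI clock messages; this only shows the note output, and the setup it assumes is illustrative rather than the tool’s actual code.

```ts
// Send one note from the browser to a DAW over Web MIDI.
async function playNote(pitch: number, velocity = 100, durationMs = 250): Promise<void> {
  const midi = await navigator.requestMIDIAccess();
  const output = [...midi.outputs.values()][0];   // first output port; an assumption
  if (!output) return;
  output.send([0x90, pitch, velocity]);                          // note on, channel 1
  setTimeout(() => output.send([0x80, pitch, 0]), durationMs);   // note off
}
```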

Latent Loops Installation at Sónar

Latent Loops is designed for electronic musicians and fans, so the Pie Shop team and I created a room-scale version of the project that debuted at the Sónar Music Festival in Barcelona this summer.

We stripped down the interface and interpreted it as an interactive installation that turns the idea of a melodic palette into a playable physical space that people can explore with their bodies. Participants craft and edit seed melodies on touch screens at the corners of the floor, then collaboratively perform a piece of music by walking across the palette together. Multiple participants create longer phrases – each player gets a count of eight to be in control before playback bounces to the next. The installation and music from the event are captured in the following video.
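
That bouncing control is, in spirit, just a round-robin over the players on the floor. Here’s a toy sketch, assuming control advances every eight beats; the beat granularity and player indexing are assumptions, not the installation’s actual logic.

```ts
const BEATS_PER_TURN = 8; // each player's "count of eight"

// Return the index of the player whose position currently drives playback.
function activePlayer(globalBeat: number, playerCount: number): number {
  if (playerCount === 0) return -1; // nobody on the floor yet
  return Math.floor(globalBeat / BEATS_PER_TURN) % playerCount;
}
```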

Empowering New Kinds of Creators

New creative technologies should not aim to replace humans, and that’s not what the Magenta project is working towards.

Drum machines didn’t put drummers out of business, even though people were skeptical of them at first. They also didn’t make it so that every person on earth could construct a sick beat. They did, however, enable new types of musicians to express themselves in new ways, and they had a huge impact on the way music sounds today.

A new technology in and of itself is of no particular artistic value. If it is adopted by artists, however, the new types of work and new types of artists it empowers can have an everlasting effect on the future of that artistic medium. It’s very exciting to be a small part of something that cool.

Acknowledgments

Thanks to Adam Roberts and Jesse Engel for their work on MusicVAE and help using it. To Zach Schwartz, Harold Cooper and Alvin Yeung for their work on the Latent Loops interface. To Harold Cooper, Cynthia Le, Diana Huang, Mo Adeleye, Jim Slater, and Brooklyn Research for their contributions to the installation. And to Jill Van Epps and Aaron Cassara for documenting.