Welcome to Lo-Fi Player! By interacting with elements in the room, you can build your own custom music room. You can also share your room with others. Or if you prefer, just relax, listen, and enjoy the view from the window. The experience is powered by machine learning models from magenta.js.
For many people it’s more fun to play around with the room without an instruction guide, so you might want to try it yourself before reading further. Just click on the image above and Lo-Fi Player will open in a new browser tab. One hint: once the music starts, the green ceiling lamp can be used to stop it.
Customize with Ease
Have you ever listened to Lo-Fi Hip Hop streams while working? Imagine if you were the producer; what kind of vibe would you create for the internet?
“Lo-Fi Player” is a virtual room in your browser that lets you play with the BEAT! Try tinkering around with the objects in the room to change the music in real-time. For example, the view outside the window relates to the background sound in the track, and you can change both the visual and the music by clicking on the window.
We chose Lo-Fi Hip Hop because it’s a popular genre with relatively simple music structure. This limited flexibility helps ensure that the music always makes sense. We’re able to create something more like a “music generating room” than a musical instrument or composition tool.
How Does ML Make Music More Interesting?
We incorporate several music machine learning models developed by the Magenta team to help make the experience more novel and dynamic. For example, the TV in the center of the room represents MusicVAE. You can use it to create new melodies by recombining existing ones. What does that mean? Imagine creating a new face for a virtual sibling by mixing your face with your mom’s, but with music. The radio beside the TV represents MelodyRNN. It’s a small “automatic loom” producing melodies instead of fabric. Clicking on the radio generates a new melody.
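For developers curious about the models, the magenta.js calls behind these two objects look roughly like the sketch below. This is a minimal sketch, not the Player’s actual code: the checkpoint URLs are the public Magenta checkpoints, and `seqA`/`seqB` stand in for two existing NoteSequence melodies you already have.

```javascript
import * as mm from '@magenta/music';

// MusicVAE: blend two existing melodies into new ones.
const vae = new mm.MusicVAE(
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/mel_4bar_small_q2');
await vae.initialize();
// Returns 5 NoteSequences morphing from seqA to seqB.
const blends = await vae.interpolate([seqA, seqB], 5);

// MelodyRNN: continue a melody, like the radio's "automatic loom".
const rnn = new mm.MusicRNN(
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_rnn/basic_rnn');
await rnn.initialize();
// The RNN expects a quantized sequence; ask for 16 more steps.
const quantized = mm.sequences.quantizeNoteSequence(seqA, 4);
const continuation = await rnn.continueSequence(quantized, 16, 1.0);
```

Because the models run entirely in the browser with TensorFlow.js, everything above happens client-side: no server round-trip is needed to generate a new melody.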
We want to show that something as simple as applying MusicVAE to short melodies can produce pleasing results when done in a creative, fun context. We also tried to design the experience to demonstrate that making new music doesn’t necessarily require expertise.
You might think: this kind of generation will never replace the great producers! We completely agree. The design goal is not to replace existing Lo-Fi Hip Hop producers or streams. Think of it more as a prototype for an interactive music piece or an interactive introduction to the genre to help people appreciate the art even more.
Shared Room on YouTube: Interactive Streaming
We found the social aspect of the project intriguing and decided to create a shared space where people can inhabit the same music room together. So as a second phase of the project, we transformed Lo-Fi Player into an interactive YouTube stream that we’ll leave running for a few weeks. Given how popular YouTube is for Lo-Fi Hip Hop streaming, it felt a bit like bringing the project back home.
The YouTube stream has a different interaction mode. Rather than clicking on elements in the room itself, you type commands into the Live Chat. These commands let you change the color of the room, change the melody, switch the instruments, and so on. Every time the beat loops, the system randomly selects comments from the live chat to modify the music, and the selected comments are highlighted with a conversation bubble. Even users who don’t interact with the room are able to hear how it evolves as it is modified by chat commands. This is very much a first attempt at ML-powered interactive YouTube streaming. It’s pretty primitive, but we hope it’s still fun to set up a room and let others modify the music being made.
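The selection mechanic itself is simple. Here is a minimal sketch of the idea in plain JavaScript; the command names and helper functions are hypothetical illustrations, not the stream’s actual code, which lives in the GitHub repository.

```javascript
// Hypothetical command set; the real stream supports more.
const VALID_COMMANDS = new Set(['color', 'melody', 'instrument']);

// Parse a chat message like "!melody 3" into { name, arg },
// or return null if it isn't a recognized command.
function parseCommand(message) {
  const match = /^!(\w+)(?:\s+(\S+))?/.exec(message);
  if (!match || !VALID_COMMANDS.has(match[1])) return null;
  return { name: match[1], arg: match[2] };
}

// Each time the beat loops, randomly pick up to `count` commands
// from those typed since the last loop; the rest are discarded.
function pickCommands(pending, count) {
  const pool = pending.slice();
  const chosen = [];
  while (pool.length > 0 && chosen.length < count) {
    const i = Math.floor(Math.random() * pool.length);
    chosen.push(pool.splice(i, 1)[0]);
  }
  return chosen;
}
```

The random draw keeps a busy chat from overwhelming the room: no matter how many commands arrive during one loop, only a handful take effect, and everyone watching sees which ones were chosen.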
Developers, Try It Yourself!
We want to make it easy for developers to build similar experiences with Magenta models. The source code for this project should be helpful, and can be found on GitHub. We also built a JavaScript tutorial called “Play, Magenta!” where you can edit the sounds and canvas live in your browser.
Background, Credits and Thanks
Vibert Thio is a Summer 2020 Magenta intern working with researcher Douglas Eck. The vision and execution for this project belong 100% to Vibert! Doug co-wrote the blog, helped manage the project and, like others on the team, had many in-depth design and technical conversations with Vibert.
Due to COVID-19, this project was particularly challenging. We were all working from home and coping with many time zones: Vibert worked from Taipei, Doug from Paris and many Magentans from California. Most of the Magenta team members have still not met Vibert in person! Given these circumstances, a shared YouTube music stream generator that bridges physical distance seems like a fitting outcome.
The sunset animated gif was created by the amazing 2D Artist / Animator Sheena Tiong and is used with permission. All other artwork was created by Vibert.
From Vibert: “Thanks to the beautiful people in Magenta for helping me make this project happen, including Fjord Hawthorne, Andy Coenen, Monica Dinculescu and others. Thanks also to Damien Henry from Google Arts and Culture and to Amit Pitaru and others from Google Creative Lab New York. Thanks to Conehead and E V E (a.k.a 酷酷小乖乖) for the counsel on music. Thanks so much to everyone who gave me any and all feedback.”