Editorial Note: In this guest post, MJ Jacob discusses how he used Magenta models in the process of creating his new EP.
My name is Michael “MJ” Jacob. By day, I work at Google as a Cloud Customer Engineer, and by night, I’m a Hip-Hop rapper/producer/engineer who goes by MJx Music. I’ve been able to accumulate a few million streams across Spotify and Apple Music alone, and I’m very grateful for those who listen and support me, and for the ability to create music that’s filled with purpose.
Hip-Hop was the first passion I fell in love with. Technology came next. I wanted to learn more about Artificial Intelligence, and after stumbling upon what is possible through Magenta, I immediately became inspired to create Hip-Hop music co-produced and inspired by Machine Learning. Before I got started, I had no idea how this would be possible, what it might sound like, or whether it would result in anything meaningful. Having no true “end goal” allowed me to be even more curious and inspired me without limits.
Six months into the learning process, I learned that it is possible to create dope-sounding Hip-Hop instrumentals with Machine Learning, and I became extremely interested in the concept of creating a Hip-Hop EP where every track was co-produced by ML. My goals for the project were to:
- Give myself a personal project with clear and tangible motivation to grow deeper in the Machine Learning space
- Inspire Hip-Hop lovers to get involved with technology through an interesting / engaging project that relates directly to the audience
- Better understand the intersection of Machine Learning and instrumental music creation to further empathize with the existing possibilities and the current limitations
- Give back to the genre of Hip-Hop in a creative way, using a new method of creating instrumentals that has not been explored much within the genre
About the EP
The EP is called Natural Causes. Given the nature of algorithmic inspiration within the major melodies and drum patterns of every track, I wanted to juxtapose the topic of Machine Learning with a very personal story, which is my life journey to-date.
My heritage stems back to Kerala, India, where my mother took the major risk of leaving her homeland to come to America in hopes of providing my siblings and me with a better life. With risk, of course, comes the chance of things not working out the way one might expect, which is exactly what happened: back in 2007, my father was arrested and handcuffed directly in front of me, in my own home. I was shocked, confused, and lost.
This incident completely shook up my family. I was only 13, and my mom had to figure out how to get a job to support me and my two older siblings. This was incredibly challenging for someone like her, living in a world foreign to her where English was by no means her first language. Our house went into foreclosure, we sold almost everything we owned, and we ended up moving into the basement of Section 8 housing for a few years.
In the midst of these times, I fell deeply in love with Hip-Hop culture and Rap. The message and mentality that “no matter what cards you are given in life, if you stay motivated, focused, and hopeful, you have the ability to make a better situation for yourself” was a concept I fully bought into, thanks to my favorite artists, whom I also saw as mentors (s/o Kanye West, Eminem, Wiz Khalifa, Lecrae, the list goes on).
Natural Causes highlights my story and my family’s life: from the initial risks, through the unexpected and difficult middle years, to finally making it through the many unrealistic pressures of life that ultimately formed something beautiful within our family.
All of the Machine Learning training and inference work was done using Magenta. The models I used were melody_rnn and drums_rnn. After vetting the options on GitHub, I quickly found that these two models helped me get to the true foundation of a Hip-Hop instrumental: a catchy lead melody and a lead drum pattern for snares/kicks/hi-hats.
I decided to train my own models using my own datasets for both the melody and drum models. The data came from my tenure as a Hip-Hop artist over the last 10 years and from collaborating with my producer friends 100Graham, B. Kim, and Wes Harris. Once I had collected all of the MIDI data, I converted the files into NoteSequences and then trained both models, which took about 10-12 hours each and on average achieved the most desirable metrics/performance around 15,000 epochs. I iterated on the models 20+ times, primarily focusing on cleaning my MIDI dataset to get the exact sound/style I was looking for. Once the melody/drumbeat models were capable of outputting MIDI that had me inspired, I would filter through hundreds of outputs to hand-select the snippets I wanted to use for my EP.
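For readers who want to try this themselves, the pipeline above maps onto Magenta’s command-line tools. This is a minimal sketch following the conventions of the melody_rnn README, not my exact commands; the directory names, the `attention_rnn` config choice, and the step counts are placeholders you’d tune for your own dataset (the drums_rnn tools work analogously with a drum config):

```shell
# Convert a directory of MIDI files into a TFRecord of NoteSequences.
convert_dir_to_note_sequences \
  --input_dir=midi_data/ \
  --output_file=notesequences.tfrecord

# Extract melodies from the NoteSequences into SequenceExamples for training,
# holding out 10% for evaluation.
melody_rnn_create_dataset \
  --config=attention_rnn \
  --input=notesequences.tfrecord \
  --output_dir=sequence_examples \
  --eval_ratio=0.10

# Train the model; checkpoints and TensorBoard summaries land in run_dir.
melody_rnn_train \
  --config=attention_rnn \
  --run_dir=run1/ \
  --sequence_example_file=sequence_examples/training_melodies.tfrecord \
  --num_training_steps=20000

# Generate a batch of candidate MIDI outputs to audition and hand-select from.
melody_rnn_generate \
  --config=attention_rnn \
  --run_dir=run1/ \
  --output_dir=generated/ \
  --num_outputs=100 \
  --num_steps=128
```

The last step is where the human curation described above begins: each run of `melody_rnn_generate` drops a batch of MIDI files into the output directory, and the hand-selection happens from there.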
Take a listen to the following audio to understand the 5-step process of going from model-generated MIDI to a completed song, using one of the songs on my EP called ‘Unexpected’:
I’m really proud of what I’ve been able to accomplish with this EP. I hope that you, the reader, can enjoy it as a consumer of music, and can also be inspired by how Machine Learning can co-exist with artists in a way that is not overly intrusive and does not replace human producers in the process. As you can probably tell from the reference audio above, a lot of human choices are needed to take inspiration from a Magenta model output and turn it into a song that you would truly “enjoy listening to in your car”.
I’m looking forward to continuing to explore this space. Feel free to connect with me on Instagram (mjxmusic), Twitter (mjxmusic), or email (firstname.lastname@example.org) if you have any questions or ideas you’d like to run by me.
This project would not have been possible without:
- The entire Google Magenta team, who have created a platform that has inspired me to do this project
- Human Co-producers: 100Graham, thereubiverse
- Cover Art Designer: Mitch Phillips - designedbymitch
- Video Producer, Director, Editor: Tyler Beatty - beattyfish
- Producer, DP, Gaffer: Caleb Gritsko - calebgritsko_dp
- Audio Engineer: JXHN PVUL
A very special thank you to my wife, and my family for believing in me enough to follow my passions every chance I get.