A primary goal of the Magenta project is to demonstrate that machine learning can be used to enable and enhance the creative potential of all people.
The demos and apps listed on this page illustrate the work of many people, both inside and outside of Google, to build fun toys, creative applications, research notebooks, and professional-grade tools that will benefit a wide range of users.
A real-time intelligent musical instrument that combines Magenta’s Piano Genie model with a physical interface consisting of fruit (or whatever else you can dream up)! Developed in partnership with The Flaming Lips for their performance at Google I/O 2019.
MidiMe is a machine learning experiment to train a small model to sound like you. All the training happens directly in the browser using TensorFlow.js; no servers or backends here!
Magenta Studio is a collection of music plugins for Ableton Live built on Magenta’s open source tools and models. It can also be downloaded as standalone, native apps with no additional dependencies.
RUNN = 🏃Run + 🤖RNN. A side-scrolling game where the player has to finish the level to listen to the full song. Each level is generated in real time with a MusicRNN model.
Every time you start drawing a doodle, Sketch RNN tries to finish it and match the category you’ve selected.
Have some fun pretending you’re a piano virtuoso using machine learning.
Converts raw audio to MIDI using Onsets and Frames, a neural network trained for polyphonic piano transcription.
A creative take on a rare electronic sequencer. Uses Magenta.js to generate drum patterns when you hit the “Improvise” button.
Generate two-dimensional palettes of drum beats and draw paths through the latent space to create evolving beats. Built by Google Creative Lab using MusicVAE.
Sketch melodies on a matrix tuned to different scales, explore a palette of generated melodic loops, and sequence longer compositions using them. Built by Google’s Pie Shop using MusicVAE.
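Many of these demos pass music to and from Magenta.js models (MusicRNN, MusicVAE, Onsets and Frames) as plain NoteSequence objects. As a rough sketch, not taken from any particular demo (the melody, field subset, and helper function here are illustrative), a minimal sequence might look like:

```javascript
// A minimal NoteSequence-shaped object of the kind Magenta.js models
// consume and produce. Field names follow the NoteSequence format;
// only a small subset of fields is shown.
const sequence = {
  ticksPerQuarter: 220,
  totalTime: 2.0,
  tempos: [{ time: 0, qpm: 120 }],
  timeSignatures: [{ time: 0, numerator: 4, denominator: 4 }],
  notes: [
    { pitch: 60, startTime: 0.0, endTime: 0.5 }, // C4
    { pitch: 60, startTime: 0.5, endTime: 1.0 }, // C4
    { pitch: 67, startTime: 1.0, endTime: 1.5 }, // G4
    { pitch: 67, startTime: 1.5, endTime: 2.0 }, // G4
  ],
};

// Illustrative helper: the total duration is the latest note end time.
function totalDuration(seq) {
  return Math.max(...seq.notes.map((n) => n.endTime));
}
```

A model call in the browser would then take an object of this shape as input, e.g. continuing it with MusicRNN or encoding it into a latent vector with MusicVAE.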