Web apps built with Magenta.js
This section includes browser-based applications, many of which are implemented with TensorFlow.js for WebGL-accelerated inference.
A web-based intelligent music application built on MelodyRNN and DrumsRNN, powered by Magenta.js.
A web-based game built on MusicVAE interpolations between melodies. Listen to the clips and work out the right order to “sort” the song.
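The interpolation idea behind this game can be sketched in plain JavaScript: MusicVAE encodes each melody to a latent vector, blends between the two vectors, and decodes each blend back into a melody. In the real library this is a single call, `model.interpolate(sequences, numSteps)`; the `lerp()` helper and the toy four-dimensional latent vectors below are illustrative only.

```javascript
// Linear blend between two latent vectors; t=0 gives zA, t=1 gives zB.
// This stands in for what MusicVAE does internally between encode and decode.
function lerp(zA, zB, t) {
  return zA.map((v, i) => v + t * (zB[i] - v));
}

const zA = [0.0, 1.0, -0.5, 2.0]; // latent code of melody A (toy values)
const zB = [1.0, 3.0, 0.5, 0.0];  // latent code of melody B (toy values)

// Four evenly spaced points along the path from zA to zB, endpoints included.
const numSteps = 4;
const path = Array.from({ length: numSteps }, (_, i) =>
  lerp(zA, zB, i / (numSteps - 1)));

console.log(path.length); // 4 latent vectors, decoded to 4 melodies in the game
```

Each intermediate vector decodes to a melody that sounds partway between the two originals, which is what makes the "sorting" puzzle solvable by ear.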
Every time you start drawing a doodle, Sketch RNN tries to finish it and match the category you’ve selected.
Have some fun pretending you’re a piano virtuoso using machine learning.
Magenta Studio is a collection of music plugins for Ableton Live built on Magenta’s open source tools and models. It can also be downloaded as standalone, native apps with no additional dependencies.
Converts raw audio to MIDI using Onsets and Frames, a neural network trained for polyphonic piano transcription.
Generate two-dimensional palettes of drum beats and draw paths through the latent space to create evolving beats. Built by Google Creative Lab using MusicVAE.
Sketch melodies on a matrix tuned to different scales, explore a palette of generated melodic loops, and sequence longer compositions using them. Built by Google’s Pie Shop using MusicVAE.
An interactive AI Experiment based on NSynth made in collaboration with Google Creative Lab that lets you interpolate between pairs of instruments to create new sounds.
An interactive AI Experiment based on MelodyRNN, made in collaboration with Google Creative Lab, that lets you make music through machine learning. A neural network was trained on many MIDI examples and learned about musical concepts, building a map of notes and timings. Just play a few notes and see how the neural net responds.
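The play-a-few-notes-and-respond flow above maps onto a small amount of Magenta.js code: prime a `MusicRNN` with a short NoteSequence and ask it to continue. The seed below runs standalone; the model calls are shown commented for shape, assuming the `@magenta/music` package and its hosted `basic_rnn` checkpoint.

```javascript
// A few notes played by the user, expressed as a plain NoteSequence object
// (the JSON format Magenta.js models consume).
const seed = {
  notes: [
    { pitch: 60, startTime: 0.0, endTime: 0.5 }, // C4
    { pitch: 64, startTime: 0.5, endTime: 1.0 }, // E4
    { pitch: 67, startTime: 1.0, endTime: 1.5 }, // G4
  ],
  totalTime: 1.5,
};

// With @magenta/music installed, the response is generated like this:
// const mm = require('@magenta/music');
// const model = new mm.MusicRNN(
//   'https://storage.googleapis.com/magentadata/js/checkpoints/music_rnn/basic_rnn');
// await model.initialize();
// const quantized = mm.sequences.quantizeNoteSequence(seed, 4); // 4 steps/quarter
// const continuation = await model.continueSequence(quantized, 32, 1.0);

console.log(seed.notes.length); // the 3-note prime the model will continue
```

The quantize step snaps the free-timed input onto the model's step grid; the last two arguments to `continueSequence` are the number of steps to generate and the sampling temperature.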