Web apps built with Magenta.js
This section includes browser-based applications, many of which are implemented with TensorFlow.js for WebGL-accelerated inference.
Converts raw audio to MIDI using Onsets and Frames, a neural network trained for polyphonic piano transcription.
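Onsets and Frames emits per-frame onset and frame-activation probabilities for each pitch; turning those into MIDI note events is a thresholding-and-grouping step. A minimal sketch of that decoding for a single pitch; the thresholds and frame rate here are illustrative assumptions, not the exact values the model uses:

```javascript
// Sketch: turn per-frame onset/frame probabilities for one pitch into
// note events. Thresholds and frame rate are illustrative assumptions,
// not the exact values used by Onsets and Frames.
const FRAME_RATE = 31.25;    // frames per second (assumed)
const ONSET_THRESHOLD = 0.5;
const FRAME_THRESHOLD = 0.5;

function decodePitch(onsetProbs, frameProbs, pitch) {
  const notes = [];
  let start = null;
  for (let t = 0; t < frameProbs.length; t++) {
    const active = frameProbs[t] >= FRAME_THRESHOLD;
    const onset = onsetProbs[t] >= ONSET_THRESHOLD;
    if (start === null) {
      // A note may only begin on a detected onset.
      if (onset && active) start = t;
    } else if (!active) {
      // The note ends when the frame activation drops.
      notes.push({ pitch, startTime: start / FRAME_RATE, endTime: t / FRAME_RATE });
      start = null;
    }
  }
  if (start !== null) {
    notes.push({ pitch, startTime: start / FRAME_RATE, endTime: frameProbs.length / FRAME_RATE });
  }
  return notes;
}

// Example: a single C4 (MIDI 60) note spanning frames 2-5.
const onsets = [0, 0, 0.9, 0.1, 0.1, 0.1, 0, 0];
const frames = [0, 0, 0.9, 0.8, 0.8, 0.7, 0, 0];
console.log(decodePitch(onsets, frames, 60));
```

Requiring a fresh onset to start a note is what lets the model separate repeated strikes of the same piano key from one sustained note.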
Generate two-dimensional palettes of drum beats and draw paths through the latent space to create evolving beats. Built by Google Creative Lab using MusicVAE.
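Drawing a path through a latent space reduces to interpolating between latent vectors. A minimal sketch of that idea with plain arrays; in the real app each interpolated vector would be decoded by MusicVAE into a drum pattern:

```javascript
// Sketch: linear interpolation between two latent vectors, as when
// dragging a path across a two-dimensional beat palette. The vectors
// here are plain arrays for illustration, not real MusicVAE tensors.
function lerp(a, b, t) {
  return a.map((ai, i) => ai + t * (b[i] - ai));
}

// n evenly spaced points from vector a to vector b (inclusive).
function latentPath(a, b, n) {
  const path = [];
  for (let k = 0; k < n; k++) {
    path.push(lerp(a, b, k / (n - 1)));
  }
  return path;
}

const a = [0, 0]; // latent code of one beat (illustrative)
const b = [1, 2]; // latent code of another beat
console.log(latentPath(a, b, 3)); // → [[0, 0], [0.5, 1], [1, 2]]
```

Because the VAE's latent space is smooth, decoding the intermediate vectors yields beats that morph gradually from one pattern to the other.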
Sketch melodies on a matrix tuned to different scales, explore a palette of generated melodic loops, and sequence longer compositions using them. Built by Google’s Pie Shop using MusicVAE.
An interactive AI Experiment based on NSynth made in collaboration with Google Creative Lab that lets you interpolate between pairs of instruments to create new sounds.
An interactive AI Experiment based on MelodyRNN, made in collaboration with Google Creative Lab, that lets you make music through machine learning. A neural network was trained on many MIDI examples and learned musical concepts, building a map of notes and timings. You play a few notes and see how the neural net responds.
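The "play a few notes" step maps naturally onto Magenta.js, whose models consume NoteSequence objects (plain JSON). A minimal sketch of a quantized seed sequence; the model call is shown only as a comment because it needs a browser and a network fetch, and the checkpoint URL and parameters are assumptions:

```javascript
// Sketch: a few played notes encoded as a quantized NoteSequence, the
// JSON format Magenta.js models consume. Pitches are MIDI numbers;
// timing is in steps (here, 4 steps per quarter note).
const seed = {
  notes: [
    { pitch: 60, quantizedStartStep: 0, quantizedEndStep: 2 }, // C4
    { pitch: 64, quantizedStartStep: 2, quantizedEndStep: 4 }, // E4
    { pitch: 67, quantizedStartStep: 4, quantizedEndStep: 6 }, // G4
  ],
  quantizationInfo: { stepsPerQuarter: 4 },
  totalQuantizedSteps: 8,
};

// In the browser, a MelodyRNN-family model would continue the seed
// (checkpoint URL and parameters are illustrative assumptions):
//
//   import * as mm from '@magenta/music';
//   const rnn = new mm.MusicRNN(
//     'https://storage.googleapis.com/magentadata/js/checkpoints/music_rnn/basic_rnn');
//   await rnn.initialize();
//   const reply = await rnn.continueSequence(seed, 16, 1.0);

console.log(seed.notes.length);
```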