Melody Mixer: Using Deeplearn.js to Mix Melodies in the Browser

Thanks to tools like Deeplearn.js, it’s becoming easier for any coder to tinker with machine learning, even without being a machine learning expert. In this article, we’ll show you an example of what Deeplearn.js makes possible through a tool called MusicVAE. We’ve combined MusicVAE with the easy-to-use web tools P5.js and Tone.js to explore musical melodies in a new experiment called Melody Mixer.

This experiment came from a simple idea: what would it sound like to blend between two different musical melodies? What if we asked a computer to start with one melody and end with the other? What melodies might we discover along the way?

Melody Mixer uses MusicVAE, a web framework released by the Magenta research team at Google. It lets us combine and transform two different melodies by blending them at any percentage. MusicVAE does this by running a deep neural network locally in your browser, using Deeplearn.js.

Open this link to try out Melody Mixer. First, listen to the two separate melodies. Next, click and drag the melodies apart, and listen to what you get. The computer has morphed from one melody to the other — using musical knowledge it has learned from analyzing 28 million different melodies — all in real time, locally in your browser. (If you want to dive deeper into the machine learning details of how that happens, check out this blog post.)

Next, try changing melodies using the menus. It’s fun to try different combinations. And, if you select “Generate,” the computer will instantly generate a new melody right in your browser, using MusicVAE. (Fun tip: You can press your enter key as a shortcut to generate new melodies.)

OK, now let’s walk through how this project was made. We’ve created some demos that make it easy for you to get started building your own projects. First, we’ll learn how to set up MusicVAE, then how to play back the melodies using Tone.js, and finally how to create the interactive component using p5.js.

You don’t need to have any experience with machine learning to do this, but you should have basic knowledge of web development.

Tools we’ll be using

P5.js: A library that helps make coding accessible for artists, designers, educators, and beginners. We’ll be using P5 to create interactive graphics.

Tone.js: A framework for creating interactive music in the browser. We’ll be using this to play back the melodies in the browser.

Deeplearn.js: A hardware-accelerated machine intelligence library for the web. Essentially, this means we can write JavaScript commands that will run directly on the GPU. We won’t be working directly with Deeplearn.js, but it’s the underlying tool that allows MusicVAE to run a machine learning model in real time in the browser.

MusicVAE: Magenta’s new library for blending between sequences.

Demo 1: How to use MusicVAE

In this demo we’re going to walk through how to initialize MusicVAE and interpolate between two melodies. Step one is to send MusicVAE the notes for each melody. We’ll start off by exploring the note sequence format that MusicVAE takes and highlighting a couple of helper functions that make the process easier.

Note sequences

Let’s start with a little melody. Here’s what it looks like as sheet music:

You can hear it here:

And here’s the same melody visualized as a piano roll:

Now, here’s how this melody is passed into MusicVAE. For each note in the melody, we specify the pitch (notice how they correspond to the piano roll above) as well as the timing, indicated by start and end slots.
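Here’s a sketch of that sequence as JavaScript. The melody and the field names (pitch, start, end) are illustrative, following the description above; check the repo for the exact structure the library expects.

    // A nine-note melody spread across 32 slots. Each note is a MIDI
    // pitch plus the slot where it starts and the slot where it ends.
    const MELODY1 = [
      { pitch: 60, start: 0,  end: 2 },
      { pitch: 62, start: 2,  end: 4 },
      { pitch: 64, start: 4,  end: 8 },   // held for four slots
      { pitch: 60, start: 8,  end: 10 },
      { pitch: 62, start: 10, end: 12 },
      { pitch: 64, start: 12, end: 16 },
      { pitch: 67, start: 16, end: 20 },
      { pitch: 65, start: 20, end: 24 },
      // Slots 24 and 25 stay empty: a short rest before the final note.
      { pitch: 64, start: 26, end: 32 },
    ];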

MusicVAE expects us to cover 32 slots, but notice how some of them can remain empty, creating a silent moment in the melody. Also notice how notes that are held longer use up more than one slot. In our example above, we’re only passing in nine notes, but some are longer, so we’re using 32 total slots.

Pitch-wise, MusicVAE allows us to specify 88 notes — just like on a piano. The lowest possible note is 21, and the highest is 108. MusicVAE is monophonic, meaning we can only play one note at a time. Length-wise, the shortest a note can be is one of the 32 potential slots, and we can give a note any duration as long as it doesn’t overlap with another note or extend beyond slot 32.

Now that you’ve seen the structure of the note sequence, let’s feed the melody to MusicVAE and get back a morphed melody.

Initialization
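Initializing the model takes just a couple of calls: create a MusicVAE instance pointed at a trained checkpoint, then wait for the weights to load. A minimal sketch, with a placeholder checkpoint URL (use the one referenced in the repo):

    // Create the model from a hosted checkpoint and wait for it to load.
    // The URL below is a placeholder, not a real checkpoint location.
    const musicVAE = new MusicVAE('https://example.com/checkpoints/melodies');
    musicVAE.initialize().then(() => {
      // The model is ready; we can now call interpolate().
    });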

With MusicVAE initialized, we can now use the interpolate function. We do this by passing in an array containing the two melodies that we want to blend, then specifying the number of interpolations we want back.

Let’s start by creating two melodies, just like the format above:
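A sketch of the call, reusing MELODY1 from Demo 1 (the exact interpolate signature here is our assumption; check the repo for the real one):

    // MELODY2 uses the same format as MELODY1 above (notes abbreviated).
    const MELODY2 = [ { pitch: 71, start: 0, end: 4 }, /* ... */ ];

    // Ask for three sequences back: the two inputs plus one blend between them.
    const numInterpolations = 3;
    musicVAE.interpolate([MELODY1, MELODY2], numInterpolations)
      .then((interpolatedNoteSequences) => {
        // interpolatedNoteSequences[0] is MELODY1, the last entry is MELODY2,
        // and everything in between is a blend of the two.
      });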

The smallest number we can use for numInterpolations is 2. With this, we would get back our original two melodies. If we set numInterpolations to 3, we get back MELODY1, then a new melody blended halfway between MELODY1 and MELODY2, followed by MELODY2. As we increase numInterpolations, we get back more sequences in between MELODY1 and MELODY2, which creates a smoother transition.

Next, try changing numInterpolations in Demo 2 and see how the notes change.

Demo 2: Displaying the results

At this point the interpolatedNoteSequences variable holds three melodies, in order:

  1. MELODY1 (just as we input it)
  2. The new blended melody
  3. MELODY2 (just as we input it)

Let’s use p5.js to plot out all three melodies on a piano roll. (In the following step we’ll create a playhead to play the music!)
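Here’s a minimal sketch of the drawing code, assuming the data layout from Demo 1 (a list of sequences, each an array of notes with pitch, start, and end):

    const SLOTS_PER_SEQUENCE = 32;

    function setup() {
      createCanvas(800, 200);
      noStroke();
      fill(0);
    }

    // p5.js calls draw() every frame. Render each sequence as a block of
    // 32 slots, and each note as a rectangle inside its block.
    function draw() {
      background(255);
      const blockWidth = width / interpolatedNoteSequences.length;
      interpolatedNoteSequences.forEach((sequence, i) => {
        for (const note of sequence) {
          const x = i * blockWidth + (note.start / SLOTS_PER_SEQUENCE) * blockWidth;
          const w = ((note.end - note.start) / SLOTS_PER_SEQUENCE) * blockWidth;
          // Map the 88 possible pitches (21 to 108) onto the canvas height.
          const y = map(note.pitch, 21, 108, height - 4, 0);
          rect(x, y, w, 4);
        }
      });
    }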

Notice that you can ask MusicVAE for more blended blocks (each one with 32 slots), which will create an even more intricate blend between the two original melodies. We do this by changing the numInterpolations variable.

Demo 3: Playing the Blended Melodies with Tone.js

Now that you’ve seen how to set up and interpolate melodies using MusicVAE, let’s get it to play back the notes in the browser using Tone.js. We have recorded piano notes that correspond to the MIDI notes 21–108, so we can easily make our own piano in the browser using the sampler from Tone.js. (Alternatively, we could create a synthesizer to play back our melodies.)
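Here’s a sketch of that setup. The sample file paths are placeholders, and Tone.Sampler repitches the nearest recording for any note we don’t have a sample for:

    // Build a piano from a handful of recordings (paths are placeholders).
    const sampler = new Tone.Sampler({
      'A1': 'samples/A1.mp3',
      'A3': 'samples/A3.mp3',
      'A5': 'samples/A5.mp3',
    }).toMaster(); // toDestination() in newer versions of Tone.js

    function playNote(pitch, durationInSeconds) {
      // Convert a MIDI pitch number (e.g. 60) into a note name (e.g. "C4").
      const noteName = Tone.Frequency(pitch, 'midi').toNote();
      sampler.triggerAttackRelease(noteName, durationInSeconds);
    }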

Next we will update our draw function to include the playback of our current note. Every time draw is called, we will use Tone to calculate where we are in the playback, as a percentage. We will then map that percentage to the current sequence and the current step within the sequence. Lastly, we will find the notes in the data and call playNote to send it to Tone.js.
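A sketch of that logic; TOTAL_PLAY_TIME and lastStep are names we’ve made up for the illustration:

    const TOTAL_PLAY_TIME = 8; // seconds for the whole set of sequences (assumed)
    let lastStep = -1;

    function draw() {
      // ...the piano roll drawing from Demo 2 stays here...

      // How far through the full playback we are, from 0 to 1.
      const progress = (Tone.now() % TOTAL_PLAY_TIME) / TOTAL_PLAY_TIME;

      // Convert that percentage into a global step count, then split it
      // into the current sequence and the current slot inside it.
      const numSequences = interpolatedNoteSequences.length;
      const globalStep = Math.floor(progress * numSequences * 32);

      // Only trigger notes when we cross into a new slot.
      if (globalStep !== lastStep) {
        lastStep = globalStep;
        const sequenceIndex = Math.floor(globalStep / 32);
        const step = globalStep % 32;
        const secondsPerSlot = TOTAL_PLAY_TIME / (numSequences * 32);
        for (const note of interpolatedNoteSequences[sequenceIndex]) {
          if (note.start === step) {
            playNote(note.pitch, (note.end - note.start) * secondsPerSlot);
          }
        }
      }
    }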

Demo 4: Adding a bit of interaction sparkle
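This last demo adds the click-and-drag behavior from the finished experiment: pulling the two melody blocks apart reveals more blends between them. A rough sketch of the idea in p5.js (the names and the mapping from drag distance to interpolation count are our own placeholders; see the repo for the real implementation):

    let dragDistance = 0;

    // p5.js calls mouseDragged() while the mouse moves with a button held.
    function mouseDragged() {
      dragDistance += movedX;
      // The farther apart the blocks are pulled, the more blended
      // sequences we request from MusicVAE.
      numInterpolations = constrain(
        2 + Math.floor(Math.abs(dragDistance) / 50), 2, 16);
    }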

Demo: g.co/melodymixer

Repo: https://github.com/googlecreativelab/melodymixer

Next Steps

We highly recommend tinkering with this example code a bit. Break it and make it do weird things!

Playing with the code will likely make you more curious about what’s happening behind the scenes with MusicVAE. To learn more, read this blog post by Magenta, the research team behind this great library.

We also recommend taking time to further explore Tone.js and p5.js, as they each have the potential to do more with MusicVAE — both musically and visually. Here are a couple of places to learn more about them:

P5.js Resources:

Tone.js Resources:

Conclusion and Takeaways

It’s exciting to see how much easier it’s become for any coder to explore machine learning thanks to tools like Deeplearn.js. We’re looking forward to seeing what others do next. If you have something to share, consider submitting it to the Experiments page so we can all learn from each other and spark new ideas for what can be made with these tools.

Special thanks to Kyle Phillips for his help making this code approachable, Adam Roberts for his work on MusicVAE, and Nikhil Thorat and Daniel Smilkov for their help getting this running and optimized with deeplearn.js.
