Problem: We wanted to create a live performance piece that would link Ableton Live to Max 7, and we really liked the idea of using the Leap Motion to control our piece.
Solution: We used Ableton and Max to create a song remix with Leap Motion control, plus sound-reactive visuals controlled by a MIDI controller.
We aimed to create a live performance piece using Ableton Live by piecing together a song using parts of many other tracks, and Max 7 to create graphics that would react to the music we created. We were looking forward to using a Leap Motion controller to control both the music and visuals, but instead we used a normal MIDI controller for the graphics since it worked better with Windows.
For the audio portion, we explored many possibilities, from Vocaloid songs to hip hop to Studio Ghibli soundtracks, and then settled on making the piece interactive by controlling the volume with a Leap Motion. For the visual aspect of the piece, we began with a disk-shaped color visual, moved on to a circle with the disk-shaped color visual superimposed, and ended with a disk-shaped color visual controlled by a MIDI controller.
When we were first learning Ableton, we started creating a song from the piano parts of Vocaloid songs to teach ourselves the software. After spending two days on it, we realized it sounded quite good and started treating it as our first iteration. But as we continued, it sounded too chopped up, so we changed the way we merged the parts: we used half the songs and tried again. Then we realized we didn't like the effect it produced on the audience and decided to change the genre of music we used.
The first take on the sound-reactive visuals was a modified version of one of the lessons built into Max. We changed the shapes the visuals created, and we also changed the material the mesh was made of. This let us create nice-looking patterns that reacted to the sound of the music, but it was fairly basic and flat-looking.
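In the Max patch, the mesh reacts to the loudness of the incoming audio. A minimal sketch of that idea, written in Python for illustration (the function names and the scale range are hypothetical, not taken from the actual patch): compute the RMS level of a block of samples and map it onto a mesh-scale parameter.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one block of audio samples in -1.0..1.0."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def amplitude_to_scale(samples, min_scale=0.5, max_scale=2.0):
    """Map the block's loudness onto a mesh-scale parameter, roughly the way
    a Max patch maps signal level onto a jit.gl.mesh attribute.
    The 0.5..2.0 output range is an illustrative assumption."""
    level = min(rms(samples), 1.0)  # clamp to full scale
    return min_scale + (max_scale - min_scale) * level
```

Silence maps to the minimum scale and a full-scale signal to the maximum, so quiet passages shrink the mesh and loud ones grow it.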
We took a new approach to cutting the audio clips and remixing them together. We switched to hip hop songs, since they were easier to work with than Vocaloid, whose odd mix of classical and hard rock sounds great with singing but not so good on its own. We went through each song with the lyrics alongside, cutting at the end of a verse or chorus. This made the clips very easy to mix together, but it turned out that Myles doesn't like this type of music, which is understandable, so we started brainstorming other music to use.
For the second version of the graphics we chose to work on visuals that were far more advanced and had 3D effects. We combined two of Masato Tsutsui's sound-reactive visual patches by playing them both in the same window. This let us layer visuals that reacted to the music on top of a rotating sphere with colored animations that reacted to the sound of the audience. While this looked really cool, it was very busy, and complicated enough that every time we ran the patches a new problem appeared.
We wanted to go back to a more classical piece, since classical pieces produce beautiful melodies that fit together easily and are pure bliss to the ears. We started remixing them together and learned new skills, like how to fade pieces into one another. This time the cuts weren't as easy as listening for the end of a verse, but these pieces did have quite definite ends of sections. We liked this version best because the songs were slow and graceful while still feeling happy; it created a great atmosphere for our audience and was easy on the ears. We then decided we wanted to add an interactive part to the song. We had been playing with Leap Motions at lunch, and it struck us that we could use one in our final piece. We hooked it up so that the volume and tempo of the piece could be controlled by moving your hand up and down or back and forth.
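The mapping above is essentially two linear rescalings: hand height to volume, hand depth to tempo. A small sketch of that logic, with hypothetical coordinate ranges (Leap Motion reports palm position in millimeters, but the exact bounds and BPM range here are illustrative assumptions, not the values from our patch):

```python
def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi],
    clamping out-of-range input so the output stays in bounds."""
    value = max(in_lo, min(value, in_hi))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def hand_to_controls(palm_y_mm, palm_z_mm):
    """Hand height (y) controls volume 0..1; hand depth (z) controls tempo
    in BPM. All four ranges are hypothetical, chosen for illustration."""
    volume = map_range(palm_y_mm, 100, 400, 0.0, 1.0)   # up/down
    tempo = map_range(palm_z_mm, -150, 150, 60, 140)    # back/forth
    return volume, tempo
```

Raising your hand to the top of the range drives the volume to full, and pushing it forward or pulling it back slides the tempo across the BPM range.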
For the third and final iteration of the visuals we chose to go back to the original patches we had been modifying. We looked deeper into the settings this time in order to create more complicated and interesting designs than before, and we were also able to map certain settings to knobs and faders on a MIDI controller. During a performance we can change the size, shape, intricacy, and colors of the graphics to draw the audience into the piece.
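Mapping knobs to visual settings comes down to scaling each 7-bit MIDI CC value (0-127) onto a parameter's range, much as Max's [scale] object does. A minimal sketch, where the knob assignments and parameter ranges are hypothetical examples rather than the actual patch's values:

```python
def cc_to_param(cc_value, lo, hi):
    """Scale a 7-bit MIDI CC value (0-127) onto a visual-parameter range."""
    return lo + (cc_value / 127.0) * (hi - lo)

def knobs_to_visuals(size_cc, hue_cc, detail_cc):
    """Hypothetical knob assignments: one knob each for the size, color hue,
    and intricacy of the graphics."""
    return {
        "size":   cc_to_param(size_cc, 0.1, 3.0),
        "hue":    cc_to_param(hue_cc, 0.0, 360.0),
        "detail": cc_to_param(detail_cc, 1, 12),
    }
```

Turning a knob from its minimum to its maximum sweeps the matching parameter across its full range, which makes live adjustment during the performance direct and predictable.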