Interactive Music + Art

Final: Jeopardy Theme Song

Jakob Sperry and Ethan Wood

Process Post

Jakob Sperry and Ethan Wood

            In 1928, Leon Theremin patented the theremin, often described as the first electronic instrument. The theremin is a simple yet unique instrument: two antennae protrude from a box, one controlling pitch and the other volume. The distance between your hand and each antenna determines the pitch and volume of the note produced. Of the thousands of instruments ever created, it is the only physical instrument played without being touched. The Leap Motion controller is a revolutionary device that tracks the movement of your hands. Inside the tiny 3” x 1” prism are two infrared cameras that track individual hands and fingers as well as gestures such as closing a fist or waving. With this data, the programming possibilities are nearly unlimited. Our group wanted to combine these two inventions, one old, one new, into something greater than either alone.

            We started this studio by learning the three programs we would be using. Max 7 is a visual programming language for media that lets you create and code almost anything imaginable. Ableton Live is a software music sequencer and digital audio workstation that interacts with Max 7. The Leap Motion app features many musical games, and the controller opens up a new world of digital possibilities. We wanted to take advantage of the fact that the controller could track both hands independently, so we set out to create a virtual theremin. Like any theremin, we programmed the device so that one hand controlled volume and the other pitch. While this was a challenge at first, we eventually got the program working. First, we had to figure out the range. We didn’t want the range to be too big, or the distance your hand would have to move between notes would be minuscule, making the instrument hard to play. However, if the range was too small, the repertoire a person could perform would be very limited. We settled on a two-octave chromatic scale from C4 to C6, which seemed to fit well. The next problem we ran into was that when playing a piece that involved jumps, you would hear all of the chromatic notes in between. To fix this, we programmed a simple command that turned the MIDI piano notes into continuous frequencies, like a traditional theremin. While you can still hear the notes in between, the slides are much less noticeable and the instrument becomes easier to play.
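We built this in Max 7, but the core mapping can be sketched in a few lines of Python. The height range and rounding behavior here are illustrative assumptions, not our actual patch values; the MIDI-to-frequency formula is the standard one (the same conversion Max’s mtof object performs).

```python
def hand_to_midi(height, low=60, high=84):
    """Map a normalized hand height (0.0-1.0) onto MIDI notes C4 (60) to C6 (84)."""
    height = min(max(height, 0.0), 1.0)          # clamp to the sensor's usable range
    return round(low + height * (high - low))    # nearest chromatic note

def mtof(note):
    """Convert a MIDI note number to a frequency in Hz (A4 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

print(hand_to_midi(0.0))   # lowest hand position -> 60 (C4)
print(hand_to_midi(1.0))   # highest hand position -> 84 (C6)
print(round(mtof(69)))     # A4 -> 440 Hz
```

Working in frequencies rather than quantized note numbers is what makes the jumps glide instead of stepping through every chromatic note.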

            The next thing we wanted our Leap theremin to do was change instruments. For this, we used the MIDI instruments provided and plugged them into our program, including over 100 MIDI instruments from piano, to strings, to drum kits. Finally, we wanted the chance to play selected tunes much more accurately by restricting the instrument to specific notes in its range. To do this, we first experimented with making the instrument play only major and minor scales. Next we added key modulation so that you could change keys. Finally, by analyzing short songs like Twinkle Twinkle Little Star, we were able to select just the notes in the piece and spread those six notes across the range of the controller. This made it possible to play a very accurate and recognizable piece. The end result was a master program able to toggle between instruments, frequency notes, MIDI notes, and even programmable songs.
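The note-restriction idea above can be sketched like this (a hedged illustration, not our Max patch): instead of mapping hand height to every chromatic note, the controller’s range is divided evenly among a small allowed set, such as the six pitches of Twinkle Twinkle Little Star in C major.

```python
# The six pitches used in "Twinkle Twinkle Little Star" in C major.
TWINKLE_NOTES = [60, 62, 64, 65, 67, 69]  # C4, D4, E4, F4, G4, A4

def hand_to_song_note(height, allowed=TWINKLE_NOTES):
    """Divide the controller's range evenly among the allowed notes."""
    height = min(max(height, 0.0), 1.0)
    index = min(int(height * len(allowed)), len(allowed) - 1)
    return allowed[index]

print(hand_to_song_note(0.0))  # bottom of range -> 60 (C4)
print(hand_to_song_note(1.0))  # top of range -> 69 (A4)
```

Because each note now owns a wide slice of the sensor’s range, small hand wobbles no longer produce wrong notes, which is what made the piece recognizable.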

            While this new Leap theremin sounded cool, we knew from the beginning that we wanted to incorporate a visual piece, since that is what Max 7 does best. The Leap controller app came with a visual tracker showing the outline of your fingers in a stick-and-bubble figure. We loved the simplicity of the look, so we decided to make this our visual, keeping all of the tracker’s original colors and appearance. We then wanted to add a background visual that reacted to the audio, but when we started to add it, our program became overloaded and we encountered lag. We really did not like this, as it suddenly became very hard to tell what was happening, so we stuck with the original visual, which was stable and lag-free. For our presentation we made the visual full screen to emphasize to the audience the vertical raising of one hand and the horizontal movement of the other. For the performers, the blend of visual and audio added a whole new dimension to playing music.

Presentation

Ryan Habermann

Musical Cancer Board

Ryan Habermann

In this project we were instructed to make a visual music piece that somehow included the theme of health. Having a relative who fought through stage four cancer, I felt the urge to build my project around cancer. I decided to attempt an educational piece built around cancer statistics. To accomplish this, I first had to research and gather statistics. I did all of my research from the National Cancer Institute’s 2016 PDF to ensure my information was correct. I chose five different themes and created a functional musical board in Max 7. The board was five by four, for twenty total buttons, each of which activated a musical track. Each of the five columns was given a description. The first was the most common cancers for men, and its buttons provided instrumental tracks. The second was the most common cancers for women, which provided a cappella tracks. The third was the best ways to prevent cancer, which provided guitar and bass tracks. The fourth was the states with the highest cancer rates, which provided drum tracks. The fifth was the deadliest cancers, which also provided drum tracks. This created a customizable board on which the user could build a musical track of their choosing while learning a cancer statistic for each button they pressed. I chose the background color to be lavender because it is the national color for cancer awareness. I included different types of instruments and vocals to ensure that the user could create a full-sounding mix.
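The board’s layout can be sketched roughly as follows; this is a hypothetical reconstruction, and the theme ordering and index scheme are assumptions rather than the actual Max 7 patch.

```python
# Five themed columns, four buttons each = twenty buttons total.
THEMES = [
    "most common cancers (men)",    # instrumental tracks
    "most common cancers (women)",  # a cappella tracks
    "ways to prevent cancer",       # guitar and bass tracks
    "states with highest rates",    # drum tracks
    "deadliest cancers",            # drum tracks
]

def button_id(column, row, rows=4):
    """Map a (column, row) press on the 5x4 board to a track index 0-19."""
    return column * rows + row

print(button_id(0, 0))  # first button -> track 0
print(button_id(4, 3))  # last button -> track 19
```

Each button index would then trigger one audio clip while displaying that column’s statistic, so the user learns a fact with every track they layer in.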

Final

Myles Lack-Zell and Grace MacPherson

Problem: We wanted to create a live performance piece that would link Ableton Live to Max 7, and we really liked the idea of using the Leap Motion to control our piece.

 

Solution: We used Ableton and Max to create a song remix with Leap Motion control, plus sound-reactive visuals controlled by a MIDI controller.


 

We aimed to create a live performance piece, using Ableton Live to piece together a song from parts of many other tracks and Max 7 to create graphics that would react to the music. We had looked forward to using a Leap Motion controller to control both the music and the visuals, but we ended up using a normal MIDI controller for the graphics since it worked better with Windows.

 

For the audio portion, we explored all the possibilities, from Vocaloid songs to hip hop to Studio Ghibli soundtracks, and then decided that the interactive part would be volume control using a Leap Motion. For the visual aspect of the piece we began with a disk-shaped color visual, then moved to a circle with the disk-shaped color superimposed, and finally to a disk-shaped color visual controlled by a MIDI controller.

Process

Myles Lack-Zell and Grace MacPherson


 

First Iteration

When we were first learning Ableton, we started creating a song using piano parts from Vocaloid songs to teach ourselves the software. After spending two days on it, we realized it sounded quite good and started treating it as an iteration. But as we continued, it sounded too chopped up, so we decided to change the way we merged the songs. We used half the songs and tried again. Then we realized we didn’t like the effect it produced on the audience and decided to change the genre of music we used.

 

The first take of the sound reactive visuals was a modified version of one of the lessons built into Max. We changed the shapes that were created by the visuals, and we also changed the material that the mesh was made of. This allowed us to create nice looking patterns that reacted to the sound of the music, but it was fairly basic and flat looking.


 

Second Iteration

We took a new approach to cutting these audio clips and remixing them together. We used hip hop songs, since they were easier to work with than the Vocaloid tracks, which were a strange mix of classical and hard rock that sounds great with singing but not so good on its own. We went through each song with the lyrics next to us, cutting at the end of a verse or chorus. This made the clips very easy to mix together, but it turned out that Myles doesn’t like this type of music, which is understandable, so we started brainstorming other possible music to use.

 

For the second version of the graphics we chose to work on visuals that were far more advanced and had 3D effects. We combined two of Masato Tsutsui’s sound-reactive visual patches by playing them both in the same window. This let us layer music-reactive visuals on top of a rotating sphere with colored animations that reacted to the sound of the audience. While this looked really cool, it was very busy, and complicated enough that every time we ran the patches there was a new problem.


 

Third/Final Iteration

We wanted to go back to more classical pieces, which produce beautiful melodies that fit together easily and are pure bliss to the ears. We started remixing them together and learnt new skills, like how to fade pieces into each other. This time the cuts weren’t as easy as listening for the end of a verse, but the pieces did have quite definite ends of sections. We liked this version best because the songs were slow and graceful while still providing happiness; it created a great atmosphere for our audience and was easy on the ears. We then decided to add an interactive part to the song. We had been playing with Leap Motions at lunch, and it struck us that we could use one in our final piece. We hooked it up so that the volume and tempo of the piece could be controlled by moving your hand up and down or back and forth.
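The control scheme above can be sketched in Python; this is an illustration rather than our actual Max/Ableton routing, and the millimetre ranges and BPM limits are assumptions.

```python
def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] to [out_lo, out_hi], clamped."""
    value = min(max(value, lo), hi)
    return out_lo + (value - lo) / (hi - lo) * (out_hi - out_lo)

def hand_to_controls(height_mm, depth_mm):
    """Vertical hand position -> volume; front/back position -> tempo."""
    # The Leap reports positions in millimetres above the sensor (assumed ranges).
    volume = scale(height_mm, 100, 400, 0.0, 1.0)
    tempo = scale(depth_mm, -100, 100, 80, 140)  # BPM range is illustrative
    return volume, tempo

print(hand_to_controls(400, 100))   # hand high and forward -> full volume, fastest tempo
print(hand_to_controls(100, -100))  # hand low and back -> silence, slowest tempo
```

Clamping at the edges of the tracked range keeps the performance stable when a hand briefly drifts out of the sensor’s view.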


For the third and final iteration of the visuals, we chose to go back to the original patches we had been modifying. We dug deeper into the settings this time to create more complicated and interesting designs than before, and we were also able to map certain settings to knobs and faders on a MIDI controller. During a performance we can change the size, shape, intricacy, and colors of the graphics to draw the audience into the piece.
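The knob-to-parameter mapping can be sketched as follows; the CC numbers and parameter names here are hypothetical stand-ins for whatever the actual patch used, and MIDI CC values run 0-127 per the MIDI spec.

```python
# Hypothetical assignment of MIDI controller knobs to visual parameters.
CC_MAP = {
    21: "size",
    22: "shape",
    23: "intricacy",
    24: "hue",
}

def cc_to_param(cc_number, cc_value):
    """Translate a MIDI CC message into a (parameter, 0.0-1.0) pair."""
    if cc_number not in CC_MAP:
        return None  # ignore knobs we haven't mapped
    return CC_MAP[cc_number], cc_value / 127.0

print(cc_to_param(21, 127))  # size knob fully turned -> ("size", 1.0)
print(cc_to_param(99, 64))   # unmapped knob -> None
```

Normalizing every knob to 0.0-1.0 makes it easy to rescale each parameter inside the visuals patch without touching the controller mapping.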