MoveMusic Demonstrations by Marty Quinn of Design Rhythmics Sonification Research Lab

In this video clip, I am moving in my living room and testing a new GigE Ethernet-based camera. It provides my interactive MoveMusic software with a higher frame rate, plus the ability to place and control cameras in a theatre environment with greater flexibility than USB-based cameras allow.

As I move, my movements are tracked, and the centers of the red squares drawn around them are used to select pixels in the colorful 'target' image seen in the middle of the screen, which was designed for an exhibit at the Children's Museum of New Hampshire in Dover, NH. The pixels are then translated into music, with color mapped to one of nine instruments and brightness mapped to pitches from a six-octave scale per instrument. The exhibit was based on the movement of flying things and is described in the poster image below.

In the video clip below, the scale is set to either a Spanish Gypsy scale or an ascending harmonic minor. The scale can be changed in real time among 100 or so scales.
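To make the color-to-instrument and brightness-to-pitch idea concrete, here is a minimal Python sketch of one plausible reading of that mapping. The instrument list, the scale (an ascending harmonic minor, one of the roughly 100 scales mentioned above), the MIDI base note, and all thresholds are illustrative assumptions, not the values MoveMusic actually uses.

```python
# Illustrative sketch only: maps an RGB pixel to (instrument, MIDI pitch)
# following the description above (color -> one of nine instruments,
# brightness -> a pitch in a six-octave scale). Instrument names, the
# scale, and all thresholds are assumptions, not the exhibit's actual values.
import colorsys

INSTRUMENTS = ["flute", "oboe", "clarinet", "violin", "viola",
               "cello", "harp", "marimba", "piano"]          # hypothetical

# Ascending harmonic minor, as semitone offsets within one octave.
HARMONIC_MINOR = [0, 2, 3, 5, 7, 8, 11]
BASE_NOTE = 33  # MIDI A1; six octaves span A1..A7

def pixel_to_note(r, g, b):
    """Return (instrument, midi_pitch) for an 8-bit RGB pixel."""
    hue, lightness, _ = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    instrument = INSTRUMENTS[min(int(hue * len(INSTRUMENTS)), len(INSTRUMENTS) - 1)]
    # Brightness picks one of 6 octaves * 7 scale degrees = 42 pitches.
    degree = min(int(lightness * 6 * len(HARMONIC_MINOR)), 6 * len(HARMONIC_MINOR) - 1)
    octave, step = divmod(degree, len(HARMONIC_MINOR))
    return instrument, BASE_NOTE + 12 * octave + HARMONIC_MINOR[step]

print(pixel_to_note(40, 80, 200))    # a darker blue -> lower pitch
print(pixel_to_note(180, 220, 255))  # a lighter blue -> higher pitch
```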

martyquinnsMoveMusic.wmv or youtube: MoveMusic at Morse Hall UNH


This MoveMusic technology will be part of a new opera being developed by the Fort Worth Opera with music composed by Libby Larsen. One or more of the performers on stage will trigger additional musical and sonic elements as part of the production and storyline. It is scheduled to premiere in 2015.

The technology evolved out of a NASA grant I received, which resulted in an exhibit called "Walk on the Sun". Through the exhibit, blind explorers could move over image data projected onto the ground and hear the pixel content of images of the Sun from NASA's STEREO mission. I have additional papers and other briefs I am happy to share. See www.drsrl.com/exhibits for more information.

Email marty@drsrl.com or call 603-988-7107 to set up a demonstration.

Bio of Marty Quinn

Marty Quinn is a musician/drummer/percussionist, computer scientist, actor and visual artist. He grew up in theatre and performed on drumset for a number of Broadway national tours. He also plays the Indian tablas, studied voice at the Voice Studio in Boston, and studied drums with Jim Chapin. He has recorded with many artists, including Pat Martino (on Starbright and my composition based on Shakespeare's Twelfth Night), Darius Brubeck, the Doah World Music Ensemble (Billboard #7 in the late 80s), and operatic singer Corneliu Montano (singing on my Tides of Venice composition). He created Design Rhythmics Sonification Research Lab in 1987 and has been developing interactive software since 2007. In addition, he has been developing skills in ballet for the past four years with Angela Carter in Dover, NH.

Marty has received 3 NASA grants, and has recently been investigating the feasibility of Massive Image Sonification.


MoveMusic
Developed by Martin Scott Quinn © 2012. All Rights Reserved.
Design Rhythmics Sonification Research Lab

MoveMusic is a new way to perceive movement through music. It uses a three-stage process. Stage 1 is the visual movement tracking stage, which uses webcam-based visual surveillance tracking to track and label the changing movements in a video stream. Stage 2 is the video-image-to-target-image mapping stage, which maps the changing movement locations in the realtime video image to similarly located pixels in a target image. Stage 3 is the sonification stage, which uses the color and brightness content of the target image pixels to select pitches played by various instruments at various volume levels. The ability to perceive the direction and location of the movement comes from 1) knowing the design of the target image and 2) knowing the sonification design, that is, how the pixel data is mapped to musical output.
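As a rough illustration of this three-stage structure, the following Python sketch wires together stand-in versions of the three stages: simple frame differencing for tracking, coordinate scaling for the video-to-target mapping, and a brightness-to-pitch rule for sonification. Every piece of logic here is simplified placeholder code, not MoveMusic's actual algorithms.

```python
# Minimal sketch of the three-stage structure described above, using
# stand-in logic throughout.

def track_movement(prev_frame, frame, threshold=30):
    """Stage 1 (stand-in): frame differencing returns (x, y) points that changed."""
    points = []
    for y, (row_a, row_b) in enumerate(zip(prev_frame, frame)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                points.append((x, y))
    return points

def map_to_target(points, video_size, target_image):
    """Stage 2: map movement locations to similarly located target-image pixels."""
    vw, vh = video_size
    th, tw = len(target_image), len(target_image[0])
    return [target_image[y * th // vh][x * tw // vw] for x, y in points]

def sonify(pixels):
    """Stage 3 (stand-in): brightness -> pitch; a fuller system would also
    pick instruments from color and assign volume levels."""
    return [36 + round(p / 255 * 48) for p in pixels]  # pitches over four octaves

# Toy 4x4 grayscale frames and a left-to-right brightness-gradient target image.
prev = [[0] * 4 for _ in range(4)]
cur = [[0, 0, 0, 0], [0, 200, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
target = [[0, 85, 170, 255]] * 4
print(sonify(map_to_target(track_movement(prev, cur), (4, 4), target)))
```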

Target images can be specially designed to produce consistent and coherent changes in music. For instance, one target image might be a color gradient field that shifts gradually from dark blue on the left to light blue on the right. If someone moves from the left side to the right side of the video image, and the changing movements are mapped into this target image, the resulting music will be a series of low to high pitches played by the 'blue' instrument. The speed of the series will depend on how fast the person moves and on the resolution and sensitivity of the Stage 1 video tracking.
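Here is a small worked sketch of that gradient example, under the same caveat that the image width and pitch range are illustrative assumptions: a target row whose blue channel brightens from left to right turns a left-to-right movement into a rising pitch series.

```python
# Worked sketch of the gradient example above: dark blue on the left,
# light blue on the right, so left-to-right motion yields rising pitches
# from the 'blue' instrument. All values are illustrative assumptions.

WIDTH = 8
# One row of the target image: blue channel runs from 80 (dark) to 255 (light).
gradient_row = [(0, 0, 80 + x * (255 - 80) // (WIDTH - 1)) for x in range(WIDTH)]

def blue_pixel_to_pitch(pixel, low=48, high=84):
    """Map the blue channel's brightness into a three-octave MIDI range."""
    _, _, b = pixel
    return low + (b - 80) * (high - low) // (255 - 80)

# A performer moving left to right maps to successive gradient pixels.
print([blue_pixel_to_pitch(gradient_row[x]) for x in range(WIDTH)])
# Moving faster simply visits these pixels in less time, so the same
# rising series plays more quickly.
```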

We envision MoveMusic giving those who cannot see the chance to perceive sports and performance arts of all kinds. Preliminary explorations suggest it may also have applications to learning physical arts. For instance, a Kung Fu teacher, sighted or not, could demonstrate a movement that results in a certain series of musical events. The student then knows from the music, and without needing to see, that the instructor's right arm moved out and their left leg moved in. The student then moves, and if the resulting music is roughly the same, both student and teacher can perceive that they have performed the same movements.

Multiple-perspective webcam tracking can give a complete 3D visual view of the movements, which can then be mapped into music using one MoveMusic process per camera. As long as the target image for each camera is unique, all of the perspectives can be heard simultaneously, providing a new opportunity to perceive 3D movement through audio and, in particular, through the musical cognitive channels of the brain.
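A sketch of that per-camera idea, with made-up pixel values and base pitches: each camera feeds its own sonification layer keyed to its own target image, so the perspectives stay distinguishable by ear.

```python
# Sketch of one sonification layer per camera. Camera names, base pitches,
# and pixel values are illustrative assumptions, not MoveMusic's design.

def sonify_layer(pixels, base_pitch):
    """Stand-in per-camera sonification: brightness offsets a base pitch."""
    return [base_pitch + p // 16 for p in pixels]

# Three camera perspectives, each mapped into a different target image
# (represented here only by the pixels the tracked movement selected).
layers = {
    "front": {"base_pitch": 60, "pixels": [30, 90, 200]},
    "side":  {"base_pitch": 48, "pixels": [10, 120, 250]},
    "above": {"base_pitch": 72, "pixels": [60, 60, 180]},
}

for name, layer in layers.items():
    print(name, sonify_layer(layer["pixels"], layer["base_pitch"]))
```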

We believe MoveMusic is something that can be easily learned. Even without knowing what the sounds mean, one can hear significant patterns of music that reflect the patterns of movement in the event being tracked. We are in close communication with the National Federation of the Blind and introduced the concept of MoveMusic at its 2008 annual convention. It is being used as part of the "Light Runners" program, a national education and public outreach program sponsored by NASA to present solar space science data through a museum-style exhibit we developed called "Walk on the Sun".

"Walk on the Sun" encourages people to walk on a data projected 3 foot Sun, and make music by the movements of their arms, hands and entire body, which are mapped into whatever image is displayed over them and onto the ground (over 2 million images of the Sun are available from 8 cameras on board two spacecraft). Once each person realized what was going on - that their movement was triggering the music which was derived from the image below their feet, they would jump right in and start moving, exploring the image sounds with their body movements. Wheelchair folks rolled over the Sun and used all the parameters at their disposal - speed, spin, and some hand movements to explore the images. The "Light Runners" program will travel to 12 cities during June 26, 2008 thru June 25, 2009. The exhibit uses a visual surveillance tracking software toolkit developed by Efi Merdler and vibration transducers called "Musica Medica" marketed by Giora Loran of Kentano LLC of Nashua, NH. The exhibit was developed originally with the help of a two-year NASA Ideas grant in collaboration with UC Berkeley Space Science Lab and the Christa McAulliffe Planetarium.For additional information on "Walk on the Sun" see the Exhibits link at www.drsrl.com.

The use of MoveMusic has implications for the choice of cameras and the way cameras are positioned and used. For sighted viewers, keeping a performer in the middle of the frame at nearly all times is pleasant, allowing the viewer to see clearly what is happening; the viewer knows the performer is moving because the background is whizzing by. For MoveMusic, since we are using visual tracking, we actually want the camera to be stationary in any one shot. The performer then moves across the image, which is precisely the information needed to tell a sightless listener that the performer is, in fact, moving in a particular way across the image. We may want wide-angle lenses to capture the whole field in some cases.

The design of the target image that the camera shot is mapped into allows for wide artistic and perceptual expression, perhaps featuring elements such as out-of-bounds areas mapped to special sounds, hot zones, incline gradients, and so on. There are many design, perceptual and cognitive issues to be explored.

Design Rhythmics Sonification Research Lab
92 High Rd
Lee, NH, USA 03861

marty@drsrl.com

© 2010