Video: martyquinnsMoveMusic.wmv, or on YouTube: "MoveMusic at Morse Hall UNH"
MoveMusic is a new way to perceive movement through music. It uses a three-stage process. Stage 1, visual movement tracking, uses webcam visual surveillance to track and label the changing movements in a video stream. Stage 2, video-image-to-target-image mapping, maps the changing movement locations in the realtime video image to similarly located pixels in a target image. Stage 3, sonification, uses the color and brightness of the target image pixels to select pitches played by various instruments at various volume levels. The ability to perceive the direction and location of the movement derives from 1) knowing the design of the target image and 2) knowing the sonification design, that is, how the pixel data is mapped to musical output.
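As a concrete illustration, the three stages above can be sketched in a few lines of Python. Stage 1 is assumed to have already produced a movement location; the target image, the instrument-selection rule, and the pitch and volume formulas are hypothetical stand-ins, not the actual MoveMusic design:

```python
def map_to_target(x, y, video_size, target_size):
    """Stage 2: map a video-frame location to the similarly located
    pixel in the target image (simple proportional scaling)."""
    vx, vy = video_size
    tx, ty = target_size
    return (x * tx // vx, y * ty // vy)

def sonify(pixel):
    """Stage 3 (hypothetical rules): the dominant color channel picks
    the 'instrument', the blue value sets the pitch, and the overall
    brightness sets the volume."""
    r, g, b = pixel
    brightness = (r + g + b) / 3
    instrument = ('red', 'green', 'blue')[max(range(3), key=lambda i: pixel[i])]
    pitch = 36 + b * 48 // 255            # map blue 0..255 into MIDI 36..84
    volume = int(brightness / 255 * 127)  # map brightness into MIDI 0..127
    return instrument, pitch, volume

# Demo: one tracked movement point in a 640x480 video frame, mapped
# into a 100x100 target image that is solid light blue.
target = [[(80, 80, 220) for _ in range(100)] for _ in range(100)]
tx, ty = map_to_target(320, 240, (640, 480), (100, 100))
print(sonify(target[ty][tx]))
```

A real implementation would send the resulting pitch and volume to a synthesizer rather than printing them; the sketch only shows how pixel data could drive the musical parameters.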
Target images can be specially designed to produce consistent and coherent changes in the music. For instance, one target image might be a color gradient field that changes gradually from dark blue on the left to light blue on the right. If someone moves from the left side to the right side of the video image, and the changing movements are mapped into this target image, the resulting music will be a series of low-to-high pitches played by the 'blue' instrument. The speed of the series depends on how fast the person moves and on the resolution and sensitivity of the Stage 1 video tracking.
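The gradient example can be simulated directly: a walk from left to right across the frame lands on progressively lighter pixels, which yield ascending pitches. The specific gradient values and pitch formula below are illustrative assumptions, not the actual MoveMusic mapping:

```python
WIDTH = 100  # width of the hypothetical target image, in pixels

def gradient_pixel(x):
    """Blue channel brightens linearly from left (dark) to right (light)."""
    return (0, 0, 55 + x * 200 // (WIDTH - 1))   # blue value 55..255

def pitch_for(x):
    """Map the pixel's blue value into a MIDI-style pitch range."""
    _, _, b = gradient_pixel(x)
    return 36 + b * 48 // 255

# Simulate someone walking left to right: the pitches rise monotonically.
pitches = [pitch_for(x) for x in (0, 25, 50, 75, 99)]
print(pitches)
```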
We envision MoveMusic will give those who cannot see the chance to perceive sports and performing arts of all kinds. Preliminary explorations suggest it may also have applications in learning physical arts. For instance, a Kung Fu teacher, sighted or not, could demonstrate a movement that results in a certain series of musical events. The student then knows from the music, without needing to see, that the instructor's right arm moved out and left leg moved in. The student then moves, and if the resulting music is roughly the same, both student and teacher can perceive that they have performed the same movements.
Multiple-perspective webcam tracking can give a complete 3D visual view of the movements, which can then be mapped into music using one MoveMusic process per camera. As long as the target image for each camera is unique, all perspectives can be heard simultaneously, providing a new opportunity to perceive 3D movement through audio, and in particular through the musical cognitive channels of the brain.
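One way to keep the simultaneous camera streams distinguishable by ear is to give each camera's target image its own timbre. The camera names and instrument assignments below are purely hypothetical:

```python
# One MoveMusic process per camera; each camera's (hypothetical) target
# image is voiced by a distinct instrument so the perspectives stay
# separable when heard simultaneously.
CAMERAS = {
    'front': 'piano',    # front view mapped into a piano-voiced target image
    'side':  'strings',  # side view into a strings-voiced target image
    'top':   'flute',    # overhead view into a flute-voiced target image
}

def sonify_event(camera, pitch):
    """Label a pitch event with the instrument of its camera's target image."""
    return (CAMERAS[camera], pitch)

events = [sonify_event('front', 60), sonify_event('side', 64), sonify_event('top', 67)]
print(events)
```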
We believe MoveMusic can be easily learned. Even without knowing what the sounds mean, one can hear significant patterns of music that reflect the patterns of movement in the event being tracked. We are in close communication with the National Federation of the Blind and began to introduce the concept of MoveMusic at their 2008 annual convention. It is being used as part of the "Light Runners" program, a national educational and public outreach program sponsored by NASA to present solar space science data through a museum-style exhibit we developed called "Walk on the Sun".
"Walk on the Sun" encourages people to walk on a data projected 3 foot Sun, and make music by the movements of their arms, hands and entire body, which are mapped into whatever image is displayed over them and onto the ground (over 2 million images of the Sun are available from 8 cameras on board two spacecraft). Once each person realized what was going on - that their movement was triggering the music which was derived from the image below their feet, they would jump right in and start moving, exploring the image sounds with their body movements. Wheelchair folks rolled over the Sun and used all the parameters at their disposal - speed, spin, and some hand movements to explore the images. The "Light Runners" program will travel to 12 cities during June 26, 2008 thru June 25, 2009. The exhibit uses a visual surveillance tracking software toolkit developed by Efi Merdler and vibration transducers called "Musica Medica" marketed by Giora Loran of Kentano LLC of Nashua, NH. The exhibit was developed originally with the help of a two-year NASA Ideas grant in collaboration with UC Berkeley Space Science Lab and the Christa McAulliffe Planetarium.For additional information on "Walk on the Sun" see the Exhibits link at www.drsrl.com.
The use of MoveMusic has implications for the choice of cameras and the way cameras are positioned and used. For sighted viewers, keeping a performer in the middle of the screen at nearly all times is pleasant, allowing the viewer to see clearly what is happening; the viewer knows the performer is moving because the background is whizzing by. For MoveMusic, since we use visual tracking, we actually want the camera to be stationary in any one shot. The performer will then move across the image, which is precisely the information needed to tell the sightless that the performer is, in fact, moving in such and such a way across the image. In some cases we may want a wide-angle lens to capture the whole field.
The design of the target image into which the camera shot is mapped allows for wide artistic and perceptual expression, perhaps featuring items such as out-of-bounds areas mapped to special sounds, hot zones, incline gradients, etc. There are many design, perceptual, and cognitive issues to be explored.
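For example, a designed target image could encode such zones directly. The layout, zone boundaries, and sound names below are purely hypothetical illustrations of the idea:

```python
WIDTH, HEIGHT = 100, 100  # hypothetical target-image dimensions
BORDER = 10               # out-of-bounds margin, in target-image pixels

def zone_sound(x, y):
    """Return the sound category for a movement mapped to (x, y)."""
    if x < BORDER or x >= WIDTH - BORDER or y < BORDER or y >= HEIGHT - BORDER:
        return 'out-of-bounds alert'     # special sound at the edges
    if 40 <= x < 60 and 40 <= y < 60:
        return 'hot-zone chime'          # a central hot zone
    return 'field tone'                  # ordinary playing area

print(zone_sound(5, 50), '|', zone_sound(50, 50), '|', zone_sound(20, 20))
```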
Sonification Research Lab
marty @ drsrl.com