Mogees, Making The World Our Musical Instrument…

Musical expression is one of humanity's greatest talents, so deeply ingrained in who we are that it evolves with us: as soon as new technologies appear, we fold them into the ways we express ourselves. The Mogees research project is one such technology, exploring gestural interfaces for musical expression. Mogees uses a simple contact microphone that feeds audio data in real time to some very sophisticated software, converting any surface into a musical instrument. From a pane of glass to a tree, if you can tap it, Mogees can turn it into an instrument.

The Mogees project has been developed by Bruno Zamborlin, a Ph.D. student at Goldsmiths Digital Studio, University of London, in collaboration with Frederic Bevilacqua, Norbert Schnell and the real-time music interaction team at IRCAM (Institut de Recherche et Coordination Acoustique/Musique).

The first demonstration of the technology took place in a live performance at the Beam festival at Brunel University, London, in June 2011. Since then the technology has continued to evolve, and it is currently used in the Airplay project by the IRCAM composer Lorenzo Pagliei.

Mogees at its simplest interprets the audio of various gestures on a surface (taps, swipes and so on) and synthesizes output sounds from them. Drawing on audio analysis techniques similar to those used in voice recognition systems, along with other leading-edge methods, Mogees can interpret and synthesize multiple audio streams at once, allowing a single musician to conduct an entire orchestra of them.
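The article does not spell out how the gestures are recognised, but the basic idea of matching an incoming buffer of contact-mic audio against stored example gestures can be sketched roughly as follows. The feature set (RMS energy, spectral centroid, zero-crossing rate) and the nearest-neighbour matching are purely illustrative assumptions, not the project's actual algorithm.

```python
import numpy as np

def gesture_features(frame, sr=44100):
    """Summarise one buffer of contact-mic audio as a small feature vector.
    Illustrative features only: RMS energy, spectral centroid, zero crossings."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    rms = np.sqrt(np.mean(frame ** 2))
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)
    zero_crossings = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
    return np.array([rms, centroid / sr, zero_crossings])

def classify_gesture(frame, templates):
    """Nearest-neighbour match of the incoming buffer against stored
    example gestures (e.g. 'tap', 'swipe'); returns the best label."""
    f = gesture_features(frame)
    return min(templates, key=lambda name: np.linalg.norm(f - templates[name]))

# Hypothetical pre-recorded templates: one feature vector per gesture type.
templates = {
    "tap":   gesture_features(np.random.randn(1024) * np.hanning(1024)),
    "swipe": gesture_features(np.cumsum(np.random.randn(1024)) * 1e-2),
}
live_buffer = np.random.randn(1024) * np.hanning(1024)  # stand-in for mic input
print(classify_gesture(live_buffer, templates))
```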

The contact microphone is central to the system. The musician's input is interpreted relative to the microphone: a tap far from it registers more softly than a tap right next to it, and the software uses this information to control audio characteristics such as timbre and duration.
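As a rough illustration of that mapping, the hypothetical sketch below turns one captured tap into loudness, duration and brightness controls. The specific formulas are assumptions for demonstration only, not the Mogees mapping.

```python
import numpy as np

def tap_to_parameters(frame, sr=44100):
    """Map one captured tap to synthesis controls. A softer tap (e.g. further
    from the contact mic) gets a quieter, shorter, duller note. The formulas
    here are illustrative assumptions, not the Mogees mapping."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    loudness = np.sqrt(np.mean(frame ** 2))                      # amplitude of the hit
    brightness = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)
    gain = float(np.clip(loudness * 10.0, 0.0, 1.0))
    return {
        "gain": gain,                                            # output level
        "duration_s": 0.1 + 2.0 * gain,                          # louder taps ring longer
        "cutoff_hz": float(np.clip(brightness, 200.0, 8000.0)),  # crude timbre control
    }

print(tap_to_parameters(np.random.randn(2048) * 0.05))
```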

The Mogees musician can use multiple gesture types, defining custom gestures as required, each assigned to one of the real-time streams, giving the player the ability to conduct a musical piece with a series of gestures. A metallic tap with a thimble on the finger might be assigned to a bell stream, with the characteristics of the musician's tap applied to the incoming bell audio, while a bare-finger tap is assigned to a drum. Set up this way, the system can control many incoming streams as well as pre-sampled audio, allowing a single musician to play dozens of instruments in real time.
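A toy sketch of that routing idea, with hypothetical gesture labels and stream names; the real system's assignment mechanism is not described in detail here.

```python
# Hypothetical routing table: each recognised gesture drives one audio stream.
routing = {
    "thimble_tap": "bells",
    "finger_tap":  "drums",
    "swipe":       "strings_sample",
}

def route(gesture_label, parameters, streams):
    """Send the gesture's parameters (gain, duration, cutoff, ...) to the
    stream assigned to that gesture; unknown gestures are ignored."""
    stream_name = routing.get(gesture_label)
    if stream_name is not None:
        streams[stream_name].append(parameters)

streams = {name: [] for name in routing.values()}
route("thimble_tap", {"gain": 0.8, "duration_s": 1.2}, streams)
route("finger_tap",  {"gain": 0.3, "duration_s": 0.4}, streams)
print(streams["bells"], streams["drums"])
```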

The simplest synthesis technique physically models the sound from the characteristics of the audio generated by the gestures, so the output reflects the properties of the surface itself. Wood produces different sound qualities than a mirror, for example, and Mogees generates the audio directly from these characteristics.
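The article does not name the physical model used, so as a stand-in the sketch below drives a simple Karplus-Strong string with the captured tap itself, which is one way a surface's attack can colour a synthesized note. It is not the actual Mogees algorithm.

```python
import numpy as np

def excited_string(excitation, sr=44100, pitch_hz=220.0, seconds=1.0, damping=0.996):
    """Karplus-Strong style resonator seeded with the captured tap, so the
    character of the surface's attack colours the synthesized note."""
    delay = int(sr / pitch_hz)
    n = int(sr * seconds)
    buf = np.zeros(delay)
    buf[:min(delay, len(excitation))] = excitation[:delay]  # seed with the tap
    out = np.zeros(n)
    for i in range(n):
        out[i] = buf[i % delay]
        # averaging filter in the feedback loop gives the decaying, string-like tone
        buf[i % delay] = damping * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
    return out

tap = np.random.randn(256) * np.hanning(256)  # stand-in for a recorded tap
note = excited_string(tap, pitch_hz=196.0, seconds=0.5)
```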

Concatenative synthesis, or audio mosaicing, generates a more approximated output that blends real-time or stored audio streams with the characteristics of the gesture sounds. The audio captured by the microphone is the source, while the stored audio is the target that will be modified using the source's characteristics. The team working on the Mogees project has so far developed two audio-mosaicing techniques.

A tree instrument

Frame-based analysis divides the source and target streams into frames of a fixed, arbitrary size, much as 25 frames make up a second of video. This is the most responsive way to process the audio in real time, and it produces target output carrying the characteristics of the musician's input (amplitude, duration, timbre and so on), giving a much more accurate representation of the source information.
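A minimal sketch of frame-based mosaicing under these assumptions: fixed-size frames, a single spectral-centroid descriptor, nearest-neighbour matching, and the chosen target frame rescaled to the source frame's loudness. The descriptors actually used by Mogees are not specified in the article.

```python
import numpy as np

def frame_features(frames, sr=44100):
    """Per-frame RMS and spectral centroid (illustrative descriptors only)."""
    spectra = np.abs(np.fft.rfft(frames, axis=1))
    freqs = np.fft.rfftfreq(frames.shape[1], 1.0 / sr)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    centroid = (spectra * freqs).sum(axis=1) / (spectra.sum(axis=1) + 1e-9)
    return rms, centroid

def frame_mosaic(source, target, frame_len=1024, sr=44100):
    """For each fixed-size source frame (the live gesture audio), pick the
    target frame (the stored instrument audio) with the closest spectral
    centroid and rescale it to the source frame's loudness."""
    n_src = len(source) // frame_len
    n_tgt = len(target) // frame_len
    src = source[:n_src * frame_len].reshape(n_src, frame_len)
    tgt = target[:n_tgt * frame_len].reshape(n_tgt, frame_len)
    src_rms, src_cent = frame_features(src, sr)
    tgt_rms, tgt_cent = frame_features(tgt, sr)
    out = np.zeros(n_src * frame_len)
    for i in range(n_src):
        j = np.argmin(np.abs(tgt_cent - src_cent[i]))   # nearest timbre
        gain = src_rms[i] / (tgt_rms[j] + 1e-9)         # follow the gesture's dynamics
        out[i * frame_len:(i + 1) * frame_len] = tgt[j] * gain
    return out

source = np.random.randn(44100) * 0.1                            # stand-in for mic audio
target = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)      # stand-in for stored audio
mosaic = frame_mosaic(source, target)
```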

Segment-based analysis divides the source and target into much larger phrases or segments whose boundaries are not arbitrary but are determined by the characteristics of the audio itself. The target audio can only be processed and output once the end of a segment has been found, so both the source and target streams are handled segment by segment, and the output only roughly approximates the target stream.
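As an illustration of segmentation driven by the audio itself, the sketch below uses a simple energy-onset rule to find segment boundaries; this is only one possible criterion, not necessarily the one the Mogees team uses.

```python
import numpy as np

def segment_boundaries(audio, frame_len=512, threshold=3.0):
    """Split audio into segments at energy onsets rather than at fixed frames.
    A frame whose energy jumps well above the previous frame starts a new
    segment (simple energy-based onset detection, for illustration only)."""
    n = len(audio) // frame_len
    frames = audio[:n * frame_len].reshape(n, frame_len)
    energy = np.sqrt(np.mean(frames ** 2, axis=1)) + 1e-9
    boundaries = [0]
    for i in range(1, n):
        if energy[i] / energy[i - 1] > threshold:
            boundaries.append(i * frame_len)
    boundaries.append(len(audio))
    return boundaries

audio = np.concatenate([np.zeros(4000),
                        np.random.randn(2000) * 0.5,   # a loud tap
                        np.zeros(4000),
                        np.random.randn(2000) * 0.5])  # another tap
print(segment_boundaries(audio))
```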

Mogees may be the simplest musical instrument of all time: just a microphone and a computer that turn the whole world into your instrument. Tapping impatiently on a desk can now be a musical experience, taking real-time audio synthesis to new and unique places.

Source: IRCAM Real-Time Musical Interactions
Source: Bruno Zamborlin

Author: Buddhas Brother