Musical expression is one of humanity's greatest talents, so deeply ingrained in who we are that it evolves with us. As soon as new technologies are created, we fold them into the ways we express ourselves. The Mogees research project is one such technology, exploring gestural interfaces for musical expression. Mogees uses a simple contact microphone that feeds audio data in real time to sophisticated software, converting any surface into a musical instrument. From a pane of glass to a tree, if you can tap it, Mogees can turn it into an instrument.
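To give a flavour of how such a pipeline might work, here is a minimal sketch of an energy-based onset detector, the kind of first stage a contact-microphone instrument could use to turn taps on a surface into note triggers. This is purely illustrative and is not Mogees' actual algorithm; the function name, frame size, and threshold are all assumptions for the example.

```python
# Hypothetical sketch (not Mogees' actual algorithm): a minimal
# energy-based onset detector for a contact-microphone signal.
# Each tap on the surface produces a burst of energy; we report
# the frame index where the signal jumps above a quiet baseline.

def detect_onsets(samples, frame_size=64, threshold=0.2):
    """Return indices of frames whose RMS energy rises above
    `threshold` after a quieter frame (i.e. a tap attack)."""
    onsets = []
    prev_rms = 0.0
    for i in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[i:i + frame_size]
        rms = (sum(s * s for s in frame) / frame_size) ** 0.5
        if rms > threshold and prev_rms <= threshold:
            onsets.append(i // frame_size)
        prev_rms = rms
    return onsets

if __name__ == "__main__":
    # Synthetic signal: silence, a loud "tap", silence, another tap.
    silence = [0.0] * 64
    tap = [0.8] * 64
    signal = silence + tap + silence + tap
    print(detect_onsets(signal))  # the taps fall in frames 1 and 3
```

A real system would go further, classifying the spectral shape of each tap so that different gestures (a knock, a scratch, a brush) trigger different sounds.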
The Mogees project has been developed by Bruno Zamborlin, a Ph.D. student at Goldsmiths Digital Studio, University of London, in collaboration with Frederic Bevilacqua, Norbert Schnell and the real-time music interaction team at IRCAM (Institut de Recherche et Coordination Acoustique/Musique).
The technology was first demonstrated in a live performance at the Beam festival at Brunel University in London in June 2011. It has continued to evolve since then, and is currently used in the Airplay project by the IRCAM composer Lorenzo Pagliei.