Machover: 'Technology should respond to human inventions'

Tod Machover
Friday 01 March 1996 00:02 GMT

Almost exactly 30 years ago, Glenn Gould published an article on the future of music recording (High Fidelity, April 1966) in which he said: "In the best of all possible worlds, art would be unnecessary. Its offer of restorative, placative therapy would go begging a patient. The professional specialisation involved in its making would be presumption... The audience would be the artist and their life would be art."

As someone with many of the usual "professional" music credentials, I was surprised, on rediscovering Gould's article recently, that my own work in music and technology has evolved in exactly this direction. In fact, I now believe that the highest priority for the coming decade or two is to create musical experiences and environments that open doors of expression and creation to anyone, anywhere, anytime. The real trick is to accomplish this without producing numbing background music - to make, instead, music that enhances the senses and stimulates the mind. I believe such "active music" could be one of our most powerful tools for discovering the unity and coherence underlying the chaos and complexity of everyday life.

My view of technology has always been that it should respond to human inventions, rather than simulate or replace them, and I started developing hyperinstruments at the MIT Media Lab in 1985 toward this end. The first generation of hyperinstruments was designed for virtuosic professional musicians, such as Yo-Yo Ma. These hyperinstruments measured many nuances of performance expression, using those measurements to enhance and expand the instrument's capabilities. In 1991, we began building hyperinstruments for non-professional music lovers. Our Joystick Music system allows a piece of music to be steered, modified, and shaped by manipulating two video-game joysticks. A Sensor Chair, designed for the magicians Penn and Teller, uses an invisible electric field to detect body motion and turn it into sound. Such instruments are easy to learn but difficult to master, with enough depth to make them worth practising and exploring.
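By way of illustration - and this is only a hypothetical sketch, not the actual Joystick Music software - the idea behind such an instrument can be reduced to a mapping from a performer's gesture to a handful of musical parameters. Every name and number below (the parameters, the ranges, the specific mapping) is an assumption made for the sake of example:

# Hypothetical sketch only, not Machover's code: maps a two-axis joystick
# reading onto a few musical parameters, in the spirit of the Joystick Music
# hyperinstrument described above. All names and numbers are assumptions.

from dataclasses import dataclass

@dataclass
class MusicState:
    tempo_bpm: float    # playback speed of the underlying piece
    transpose: int      # semitone shift applied to the melody
    brightness: float   # 0.0 (dark, muted timbre) to 1.0 (bright timbre)

def steer(x: float, y: float, base: MusicState) -> MusicState:
    """Map joystick axes (each in -1.0 .. 1.0) onto the music.

    Pushing right or left (x) speeds the piece up or slows it down;
    pushing forward or back (y) raises or lowers the melody and opens
    or closes the timbre.
    """
    x = max(-1.0, min(1.0, x))
    y = max(-1.0, min(1.0, y))
    return MusicState(
        tempo_bpm=base.tempo_bpm * (1.0 + 0.25 * x),   # up to a 25% tempo swing
        transpose=round(7 * y),                        # up to a fifth up or down
        brightness=0.5 + 0.5 * y,                      # forward = brighter
    )

base = MusicState(tempo_bpm=96.0, transpose=0, brightness=0.5)
print(steer(0.4, -0.6, base))  # a push right and a pull back: faster, lower, darker

The shape of the interaction is what matters here: a small, continuous gesture produces an immediately audible change, so the player hears the consequences of an intention rather than operating a machine.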

We are in the middle of one of our largest projects yet, the Brain Opera, in which the audience - live and via the Internet - will contribute to and perform the piece. A maze of hyperinstruments will let people play with aspects of music (Rhythm Tree, Harmonic Driving, Gesture Wall, Melody Easel, etc) before attending a performance where a complete version of the piece will incorporate the sounds they have created.

The goal is not just to have audience members contribute sounds or spoken text, thoughts, memories or favourite songs, but to prompt reflection on the deeper meaning of each. The attempt is to present an exploration of how our minds turn fragmented experience into coherent views of the world. To achieve this, I am seeking a new kind of balance between the ordered complexity of Bach and the exuberant chaos of Cage. The audience will be in the middle of this, making the artistic experience more palpable and visceral to each active person while underlining the collaborative nature of the project.

The Brain Opera will premiere this summer at the New Lincoln Center Festival in New York and will travel worldwide. It is hard to foresee what we will learn from it, but I predict we will go even further toward the vision expressed by Glenn Gould in 1966. I imagine musical instruments built into our environments - furniture, clothing, walls, hand-held objects - projecting our intentions on to our environment. A concert would not be a special occasion but always around us, enhancing our actions at some moments, providing counterpoint at others.
