History of Brainwave Music

Musicians and composers have been using brainwaves in music for almost 50 years, and the possibility of a brain-computer interface for direct communication and control was first seriously investigated in the early 1970s. This period reflected a significant trend towards interdisciplinary practice within the arts, influenced by the experimental and avant-garde artists of the time and by a growing engagement with Eastern music and philosophies among those in the field.

Musifying Alpha


In 1965 Alvin Lucier performed a piece for live percussion and brain waves titled Music for Solo Performer. The piece was inspired by Lucier's experiments with the physicist Edmond Dewan into controlling bursts of alpha activity through meditative states. Alpha waves (or rhythms) is the term used to describe brain activity in the 8-12 Hz range, commonly associated with relaxed states of attentiveness.
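Alpha activity of the kind Lucier and Dewan tracked is usually quantified as the power of the EEG signal within that 8-12 Hz band. A minimal sketch of such a measurement, using a simple periodogram on a synthetic signal; the sample rate and test frequencies here are illustrative assumptions:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` between `low` and `high` Hz,
    estimated from a simple FFT periodogram."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return float(np.mean(spectrum[mask]))

# Synthetic test signal: a strong 10 Hz "alpha" component plus
# weaker 30 Hz activity.
fs = 256                                   # assumed sample rate in Hz
t = np.arange(0, 4, 1 / fs)
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 30 * t)

# The alpha band dominates the 25-35 Hz band for this signal.
print(band_power(eeg, fs, 8, 12), band_power(eeg, fs, 25, 35))
```

In a biofeedback setting this measurement would be repeated over short windows, so the performer hears the result of each attempt to sustain alpha.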

During the performance Lucier amplified his alpha waves, read from two electrodes positioned on his forehead, through a series of loudspeakers. As the frequencies contained in alpha waves are below the threshold of human hearing, the loudspeakers were coupled with resonant percussion instruments, including cymbals, gongs, bass drums and timpani, as a way of musifying the brainwave activity.


In contrast to Lucier's desire to communicate the natural frequencies of brain activity through acoustic and tangible sound sources, Richard Teitelbaum, a musician in the electronic ensemble Musica Elettronica Viva (MEV), began to incorporate bio-signals into his electronic compositions using cutting-edge modular synthesisers in the late 1960s and 1970s. Taking inspiration from Lucier and from new advances in synthesis technology, Teitelbaum integrated EEG signals alongside other bio-signals into his pieces, many of which focused on, and conceptualised, meditative states of mind. Performed throughout 1967, Spacecraft was Teitelbaum's first use of amplified EEG activity as a control voltage (CV) signal for a Moog synthesiser. Here the electrical activity of the brain was electronically sonified in real time, again providing a real-time bio-feedback loop for the performer (Teitelbaum 2006). Although Spacecraft was a wholly improvised composition, it provided a foundation for his later uses of brain waves, which sought to investigate elements of control and musical interaction.
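The CV idea behind Spacecraft, in which a slowly varying voltage derived from the amplified EEG steers a synthesiser parameter, can be sketched digitally. Below, a rectified-and-smoothed envelope of a stand-in signal drives an oscillator's frequency; the function names, base frequency and modulation depth are illustrative assumptions, not Teitelbaum's actual patch:

```python
import numpy as np

def envelope(signal, window=64):
    """Rectify and smooth: a rough digital stand-in for the envelope
    follower that turns a fast bio-signal into a slow control voltage."""
    kernel = np.ones(window) / window
    return np.convolve(np.abs(signal), kernel, mode="same")

def cv_oscillator(cv, fs, base_hz=110.0, depth_hz=220.0):
    """Sine oscillator whose instantaneous frequency tracks `cv`
    (expected in 0..1), like a VCO driven by a control voltage."""
    freq = base_hz + depth_hz * np.clip(cv, 0.0, 1.0)
    phase = 2 * np.pi * np.cumsum(freq) / fs  # integrate frequency into phase
    return np.sin(phase)

fs = 8000
t = np.arange(0, 1, 1 / fs)
fake_eeg = np.sin(2 * np.pi * 10 * t)      # stand-in for an alpha burst
cv = envelope(fake_eeg)                    # slow control signal
audio = cv_oscillator(cv, fs)              # sonified output
```

The same structure, an envelope follower feeding a voltage-controlled parameter, underlies most analogue bio-signal patches of the period, whatever the sound source.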

In Tune, perhaps Teitelbaum's most popular work, was first performed in Rome in 1967. What stands out in later versions of the piece (referred to by the composer as the expanded version) is the introduction of a second performer's EEG into the system. Alongside other bio-signals, including heartbeat and amplified breath, alpha activity was measured and then split into two paths within the modular system, which comprised analogue synthesis modules, a mixer and audio effects.

Schematic for Teitelbaum’s In Tune (expanded version)

Further work

Other artists at this time were also experimenting with alpha, such as the Finnish artist Erkki Kurenniemi, whose instrument Dimi-T used EEG to control the pitch of an oscillator (Ojanen et al. 2007). Manfred Eaton's book Bio-Music (Eaton 1971) presented his ideas for an adaptive bio-feedback instrument: a musical brain system driven by visual and auditory stimuli.


BioMuse, a hardware and software system developed by Benjamin Knapp and Hugh Lusted in the 1990s, marked a major departure from these analogue approaches, using real-time digital computing to process EEG data.

BioMuse provided a portable kit for digitally processing bio-signals, but what was groundbreaking about its methods was its ability to convert these signals into MIDI data, in effect creating a MIDI controller based on bodily responses (BioMuse also measured eye movements, muscle movements and sound from a microphone input). The use of the MIDI protocol allowed an EEG signal to be mapped to the input of any MIDI-enabled equipment, such as a synthesiser, a drum machine or a sequencer.
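The core of such a mapping is simple scaling: a bio-signal reading is normalised, quantised to MIDI's 7-bit value range, and wrapped in a Control Change message. A minimal sketch of this idea; the parameter names and ranges are illustrative assumptions, not BioMuse's actual scheme:

```python
def biosignal_to_cc(value, lo, hi, channel=0, controller=1):
    """Map a bio-signal reading in the range [lo, hi] onto a 3-byte
    MIDI Control Change message. Channel and controller numbers here
    are arbitrary illustrative choices."""
    # Clamp and normalise the reading to 0..1.
    norm = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    cc_value = round(norm * 127)          # quantise to MIDI's 7-bit range
    status = 0xB0 | (channel & 0x0F)      # 0xB0 = Control Change status byte
    return bytes([status, controller, cc_value])

# A mid-scale reading maps to CC value 64; out-of-range readings clamp.
print(biosignal_to_cc(5.0, 0.0, 10.0))
```

Sending such messages to a synthesiser, drum machine or sequencer is then a matter of writing the bytes to a MIDI output port.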

A demonstration of BioMuse at the 1990 New Music Seminar in New York City showcased this method of mapping multiple bio-signals to MIDI parameters.

BioMuse formed the basis of the commercially available BioControl Systems products and led to the development of BioTools, a software toolkit for interfacing with bio-sensors built in the Max/MSP environment. The BioMuse Trio, led by Knapp, has been performing since 2008, integrating acoustic instrumentation with bio-signal processing.

A piece using updated versions of these tools, Music for Sleeping and Waking Minds (2011), is an eight-hour composition intended for night-time listening. Four performers wearing EEG sensors affect the properties of tones through simple direct mappings, projecting basic changes in their brainwave activity to the audience.