Mapping

Mapping is the concept of connecting an input control source to a musical function.

A basic mapping example is the connection of a finger pressing a piano key to the hammer striking the string. In simple terms, the mapping is Push key -> Play note. A secondary mapping is also defined in terms of pressure. This translates to Push hard -> Loud, Push soft -> Quiet. All the dynamics in between equate to a linear mapping of pressure to volume.
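As a rough illustration, that linear pressure-to-volume mapping can be expressed in a few lines of code. This is just a sketch in Python; the normalised pressure range and the MIDI velocity scale (0-127) are assumptions for the example, not anything specific to a real piano action.

def pressure_to_velocity(pressure):
    """Map a normalised key pressure (0.0-1.0) linearly onto a MIDI velocity (0-127)."""
    pressure = max(0.0, min(1.0, pressure))  # clamp to the valid range
    return round(pressure * 127)

print(pressure_to_velocity(0.2))  # soft press -> 25 (quiet)
print(pressure_to_velocity(0.9))  # hard press -> 114 (loud)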

Designing systems that use brainwaves as the input control for playing electronic music requires mapping design. Which brainwave connects to volume? Which one to pitch? Should pitch go up or down? What is the octave range? Mappings hold the key to simplicity, complexity and creativity when composing music for brain control and when building systems that respond to humans in meaningful and musical ways.

To evaluate mappings in a BCMI we need to break down the connections by examining the main components:

BCMI System

[Figure: the components of a BCMI with a notation-based musical engine]

Input controls come from brain signals measured using EEG. Electrodes are placed on the scalp to read the electrical activity of the brain. With brain control, specific frequencies of the signal are analysed in real time. A computer program processes the data to extract the meaning within the EEG.
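To make that concrete, here is a hedged sketch of the kind of frequency analysis involved, written in Python with NumPy/SciPy. The 256 Hz sampling rate, the window length and the choice of the alpha band (8-12 Hz) are assumptions for the example; a real BCMI would run this continuously on a live stream rather than on a single buffer.

import numpy as np
from scipy.signal import welch

def alpha_power(eeg_window, fs=256.0):
    """Estimate power in the alpha band (8-12 Hz) for one window of EEG samples."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=min(len(eeg_window), 256))
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return float(np.trapz(psd[band], freqs[band]))

# One second of fake EEG; in a real BCMI this value would become the input control.
print(alpha_power(np.random.randn(256)))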

Transformation algorithms are a series of rules that govern how the input controls connect to the music. They are bespoke elements of code and can be built in environments such as Matlab or Xcode, but more usefully (to music makers) in platforms such as Pure Data or MAX/MSP.

A simple rule could define the volume characteristic (like the piano, above) and say:

For low input signals -> Quiet volume

For high input signals -> Loud volume.
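A minimal version of that rule in code might look like this (Python again; the input is assumed to be a normalised signal level between 0 and 1, and the 0.5 threshold and example velocities are arbitrary):

QUIET, LOUD = 40, 110  # example MIDI velocities

def volume_rule(signal_level):
    """For low input signals -> quiet volume; for high input signals -> loud volume."""
    return LOUD if signal_level >= 0.5 else QUIET

print(volume_rule(0.2))  # -> 40
print(volume_rule(0.8))  # -> 110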

More complex rules can provide more sophisticated mappings and go beyond emulating acoustic instruments. With a bespoke digital system effectively any rule can be applied to perform complex musical tasks.

An example of a complex mapping rule is the Timing Rule, which is used extensively in The Warren. The Timing Rule measures how long brainwaves stay above an amplitude threshold and outputs different commands accordingly. The user has to control their brainwave amplitude at specific times to trigger the desired command. In this way a Timing Rule applied to one brainwave can have a number of potential outcomes.

So we can set up a straightforward one (input) to many (output) mapping, such as:

If time above threshold < 2 seconds, do A

If time above threshold >= 2 seconds and < 4 seconds, do B

If time above threshold >= 4 seconds and < 6 seconds, do C

If time above threshold >= 6 seconds and < 8 seconds, do D
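A sketch of that one-to-many Timing Rule in code: the two-second bands and the command names come straight from the outline above, while everything else (the function name, returning None when the hold is too long) is just an assumption for the example.

def timing_rule(seconds_above_threshold):
    """Translate how long the brainwave amplitude stayed above the threshold into a command."""
    if seconds_above_threshold < 2:
        return "A"
    elif seconds_above_threshold < 4:
        return "B"
    elif seconds_above_threshold < 6:
        return "C"
    elif seconds_above_threshold < 8:
        return "D"
    return None  # held too long: no command

print(timing_rule(1.5))  # -> "A"
print(timing_rule(5.0))  # -> "C"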

The non-musical commands A, B, C and D are translated by the musical engine. Data from the transformation algorithm needs to be sent to it via MIDI or OSC.
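For example, the chosen command could be sent as an OSC message using the python-osc library (the address /bcmi/command, the local IP and the port 9000 are assumptions; a MIDI library such as mido could carry the same information as note or controller messages instead):

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # musical engine assumed to be listening locally
client.send_message("/bcmi/command", "C")    # send whichever command the rule produced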

The musical engine receives the rules from the transformation algorithm. It contains no intelligence but passively responds to whichever rule is being followed. Digital Audio Workstations (DAWs) such as Ableton Live, Logic, Reason or MAX/MSP are ideal as they have a vast number of musical parameters that can respond to incoming MIDI or OSC messages.

The musical engine converts the commands from the rules into controls for musical parameters. These parameters can be any of the controls that exist inside the DAW, such as turning up reverb, playing the sound of a kick drum, or a combination of many things at once.
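On the receiving side, the same idea in sketch form: an OSC server (python-osc again, with the same assumed address and port) listens for each command and maps it onto an illustrative parameter change or trigger. In practice these actions would be MIDI or OSC messages routed into the DAW rather than print statements.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle_command(address, command):
    # Purely illustrative mapping from non-musical commands to musical parameters.
    actions = {
        "A": "turn up the reverb",
        "B": "trigger the kick drum",
        "C": "transpose the melody up an octave",
        "D": "mute the pad",
    }
    print(actions.get(command, "ignore"))

dispatcher = Dispatcher()
dispatcher.map("/bcmi/command", handle_command)
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()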

Mappings in Practice

The mappings contained in my compositions range in complexity and functionality. One of my aims is to explore the creative mapping potential of using brainwaves as an input source. Mappings can be combined, interlaced in interesting and crazy ways, be musically intuitive or be completely non-musical in how they operate. They can tackle problems far too difficult to solve (but fun to try!), and they can even be posed as games to play.

Although they are not discussed in huge detail here (yet), some are talked about elsewhere on this site (the game-based approach to the mappings in Flex, for example), while others are presented loosely on my blog.

Summary

Mappings are intrinsic to the feel of an interface, in the same way that strumming a string makes a guitar a guitar. They are traditionally limited to the gestures, control and behaviour of the input signal. I like building hidden mappings into compositions so that things are not always so rigid; musical surprises are fun (and can be quite scary!) for a performer and an audience. Some of my later work explored the effects of mapping emotional values detected in EEG to musical parameters.
