Flex is a generative, multi-channel music performance piece controlled by a Brain-Computer Music Interface (BCMI). It investigates the playful side of electronic music performance, toying with ideas of gesture, expression and immersion by harnessing cognitive thought processes and brain-wave data classification. My bespoke BCMI (an Emotiv headset paired with my own software) is mapped to Integra Live, a new musical engine developed at Birmingham Conservatoire, so that brain signals from the performer alone provide real-time control of the music.
Flex treats the mind as muscle, using thought to replace the physical embodiment of performance. As a result, the mappings are key to providing the feel, direction and accuracy of the piece, from micro- up to macro-parameters, replacing traditional physical controllers to manipulate, arrange, synthesise and diffuse combinations of recorded and computer-generated sound. Flex has been performed at Sight, Space, Sound and Play in Leicester, UK, at the 10th CMMR (2013) in Marseille, France, and for Mercy/Syndrome at FACT, Liverpool, UK.
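To give a flavour of how a micro-to-macro mapping layer like this can work, here is a minimal, hypothetical sketch (not the actual Flex or Integra Live code): a classified mental command selects a macro parameter (which sound layer is active), while a continuous signal-power value shapes a micro parameter (a filter cutoff). All labels, scene names and ranges are assumptions for illustration only.

```python
def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] to [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return out_lo + t * (out_hi - out_lo)

# Macro level: a discrete classifier label chooses the active scene.
# These labels/scenes are invented for the sketch.
MACRO_MAP = {
    "push": "drone_layer",
    "lift": "percussive_layer",
    "neutral": "ambient_bed",
}

def map_brain_state(label, band_power):
    """Turn one classified EEG frame into control messages.

    label      -- hypothetical classifier output ("push", "lift", ...)
    band_power -- normalised signal power in [0, 1]
    """
    scene = MACRO_MAP.get(label, "ambient_bed")          # macro parameter
    cutoff_hz = scale(band_power, 0.0, 1.0, 200.0, 8000.0)  # micro parameter
    return {"scene": scene, "cutoff_hz": cutoff_hz}

print(map_brain_state("push", 0.5))
# → {'scene': 'drone_layer', 'cutoff_hz': 4100.0}
```

In a real BCMI pipeline these control values would then be sent onward (for example over OSC or MIDI) to the audio engine; the sketch stops at the mapping itself.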
Here’s a video about Flex, made by Nathan Gregg: