I’ve had a paper accepted at NIME 2014 (New Interfaces for Musical Expression), which is being held at Goldsmiths in London.

It’s about a new performance piece I’m working on with my friend Weiwei Jin. The idea is that the performance is directed in real time by the emotions of a performer and the audience, measured through their brainwaves. Imagine going to see your favourite artist perform and finding the whole performance arranged in response to your emotions. We are planning to debut the piece in Berlin in December, all going to plan. To validate the brainwave reading methods, I’ve been conducting some experiments in the lab with Dr. Duncan Williams of the BCMI-Midas project. We’re hoping to publish the results of those this year too.

Anyway, here’s the abstract of the paper:

The Space Between Us is a live performance piece for vocals, piano and live electronics using a Brain-Computer Music Interface system. The brainwaves of one performer and one audience member are measured throughout the performance and the system generates a real-time score based on mapping the emotional features extracted from the brain signals. The system not only aims to portray emotional states through music but also to direct and induce emotional states through the real-time generation of the score, highlighting the potential of direct neural-emotional manipulation in live performance. We measure the two emotional descriptors, valence and arousal, within the electroencephalogram (EEG) recordings and map the two-dimensional correlate of averaged windows to musical phrases. These pre-composed phrases contain associated emotional content based on the KTH Performance Rules System (Director Musices). The piece is in three movements: the first two are led by the emotions of each subject respectively, whilst the third movement interpolates between the responses of the performer and the audience member. The system not only aims to reflect the individuals’ emotional states but also attempts to induce a shared emotional experience by drawing the two responses together. This work highlights the potential for effecting neural-emotional manipulation within live performance and demonstrates a new approach to real-time, affectively-driven composition.
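To give a flavour of what "mapping the two-dimensional correlate of averaged windows to musical phrases" might look like in practice, here is a minimal sketch in Python. It assumes per-window valence and arousal estimates have already been extracted from the EEG, and picks the nearest pre-composed phrase in the valence–arousal plane; the phrase names, coordinates and the nearest-neighbour choice are all illustrative assumptions, not the system described in the paper.

```python
import numpy as np

# Hypothetical bank of pre-composed phrases, each tagged with a
# valence/arousal coordinate in [-1, 1] x [-1, 1]. Names and values
# are illustrative only.
PHRASE_BANK = {
    "calm_negative":   {"valence": -0.6, "arousal": -0.7},
    "tense_negative":  {"valence": -0.7, "arousal":  0.6},
    "serene_positive": {"valence":  0.6, "arousal": -0.5},
    "joyful_positive": {"valence":  0.7, "arousal":  0.7},
}

def average_window(valence_samples, arousal_samples):
    """Average per-sample valence/arousal estimates over one EEG window."""
    return float(np.mean(valence_samples)), float(np.mean(arousal_samples))

def select_phrase(valence, arousal, bank=PHRASE_BANK):
    """Return the phrase whose tagged emotion is closest (Euclidean
    distance) to the measured point in the valence-arousal plane."""
    def distance(tag):
        dv = bank[tag]["valence"] - valence
        da = bank[tag]["arousal"] - arousal
        return (dv * dv + da * da) ** 0.5
    return min(bank, key=distance)

# Example: one window of (made-up) per-sample estimates.
v, a = average_window([0.5, 0.7, 0.6], [0.6, 0.8, 0.7])
print(select_phrase(v, a))  # -> "joyful_positive"
```

In the actual piece the phrases also carry emotional content shaped by the KTH Performance Rules System (Director Musices), so the real mapping is richer than a simple nearest-neighbour lookup, but the basic idea of turning an averaged valence–arousal point into a choice of phrase is the same.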

[Image: EEG brain cap on a performer's head, seen from behind]