
It’s about time for a long-overdue blog post (in fact I have a few lined up over the next few days), so this one’s about a new area I’ve been working in over the last few months: measuring affective states in EEG and looking at how these correlate with music.

The BCMI-Midas project is a collaboration between Plymouth University and the University of Reading whose focus is to “… develop technology for building innovative intelligent systems that can monitor our affective state, and induce specific affective states through music, automatically and adaptively.”

Inspired by this, I was interested to see whether affective states, which relate to emotional levels, can be employed in music-making BCMI systems. Working with Dr Duncan Williams, we conducted an initial test of a system I built that aimed to measure emotional responses in EEG and play back music (with crowd-sourced emotional tags) that reflected a listener’s mood. The experiment and system, which we called the Affective Jukebox (because it worked in real time, selecting music based on the response to the previous song), proved successful in small trials and was the subject of a poster presentation at this year’s Joint International Conference on Computer Music and Sound and Music Computing 2014, held at the University of Athens. The full paper can be downloaded here. This technique presents a fairly unique method of neurofeedback where control is harnessed via emotional levels, and in this case music is selected and played back to reflect them.
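To give a flavour of the idea, here is a minimal sketch of that jukebox loop in Python. Everything here is illustrative rather than the actual system: the track names, the hand-picked valence/arousal tags, and the stand-in EEG readings are all hypothetical, and a nearest-neighbour match in arousal/valence space stands in for whatever selection logic the real system uses.

```python
import math

# Hypothetical crowd-sourced library: each track tagged with a
# (valence, arousal) coordinate in the range [-1, 1].
TRACKS = {
    "calm_piano.wav":    (0.6, -0.7),  # positive, low energy
    "upbeat_funk.wav":   (0.8,  0.8),  # positive, high energy
    "dark_drone.wav":    (-0.7, -0.5), # negative, low energy
    "tense_strings.wav": (-0.6,  0.7), # negative, high energy
}

def select_track(valence, arousal, exclude=None):
    """Pick the tagged track closest (Euclidean distance) to the
    listener's measured affective state, skipping the current song."""
    candidates = {k: v for k, v in TRACKS.items() if k != exclude}
    return min(candidates,
               key=lambda t: math.dist(candidates[t], (valence, arousal)))

# Jukebox loop (pseudo-real-time): the affect measured during the
# current song chooses the next one.
current = None
for valence, arousal in [(0.5, -0.6), (0.7, 0.9)]:  # stand-in EEG readings
    current = select_track(valence, arousal, exclude=current)
    print(current)
```

The key design point is the feedback loop: each selection is driven by the response to the previous song, so the listener’s affective state steers the playlist without any conscious control input.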

Building on this idea, I am currently working on a piece of music that composes on the fly based on a user’s emotions. In fact, it measures two people’s emotional patterns in their brainwaves: a performer’s and an audience member’s. It uses this information to build a score for a performance whilst the levels are being measured. I built the bones of the system for the piece and composed some initial foundations to demonstrate the idea, and this was the subject of a poster presentation at this year’s New Interfaces for Musical Expression (NIME) 2014, hosted at Goldsmiths over the summer. The piece, The Space Between Us, is being performed in Berlin this December, and will be the subject of more posts on this site soon. Anyway, the paper that describes the system and the piece can be downloaded here.
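A toy version of the two-person idea might look like the sketch below. To be clear, these particular mappings (tempo from averaged arousal, major/minor mode from averaged valence, a “tension” value from the distance between the two people’s states) are my illustrative assumptions, not the mappings the piece actually uses.

```python
def score_parameters(performer_av, audience_av):
    """Blend performer and audience (valence, arousal) readings, each in
    [-1, 1], into simple score parameters. All mappings are hypothetical."""
    valence = (performer_av[0] + audience_av[0]) / 2
    arousal = (performer_av[1] + audience_av[1]) / 2
    return {
        "tempo_bpm": round(60 + 60 * (arousal + 1) / 2),  # 60-120 BPM
        "mode": "major" if valence >= 0 else "minor",
        # The gap between the two affective states could drive, say,
        # harmonic tension -- the "space between" the two listeners.
        "tension": round(abs(performer_av[0] - audience_av[0]) +
                         abs(performer_av[1] - audience_av[1]), 2),
    }

print(score_parameters((0.4, 0.2), (-0.2, 0.6)))
```

Run repeatedly against live readings, something of this shape would yield a stream of parameters from which a score can be generated while the levels are still being measured.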

One of the reasons I want to explore emotional levels in brainwaves further is to be able to employ more complex mapping strategies in BCMI design. Currently it’s difficult to get a polyphonic type of control in BCI measurement; that is, controlling more than one thing at a time. So, more often than not, one-to-many mappings are the only type of control mapping on offer. At the moment I use one active (conscious) control method. With affective response (or AV response, for arousal/valence), another layer can be added that isn’t tied to that active control. The downside is that AV response is passive: a user cannot consciously control their affective state (or can they?!). But even so, an interesting neurofeedback loop is created that can influence music based on emotion.
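The layering idea can be sketched as follows: one active control value fans out to several destinations (the one-to-many mapping), while the passive AV reading colours the result as an independent layer. Again, the parameter names and ranges here are made up for illustration; the point is only the shape of the mapping.

```python
def map_controls(active_level, valence, arousal):
    """Combine one active (conscious) control value in [0, 1] with a
    passive (valence, arousal) reading, each in [-1, 1]. Illustrative only."""
    return {
        # The single active channel fans out one-to-many...
        "note_density": active_level,
        "filter_cutoff_hz": 200 + 4800 * active_level,
        # ...while the affective channel adds a second, independent layer.
        "mode": "major" if valence >= 0 else "minor",
        "tempo_bpm": round(70 + 50 * (arousal + 1) / 2),
    }
```

Because the two input channels are independent, this gives something closer to polyphonic control than a single active channel alone: the user consciously shapes one group of parameters while their affective state shapes another.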

Stunning Athens