So #pacmf has finally been and gone, and although I didn’t stay for all of it I was heavily involved in proceedings from Friday night through to Saturday. First up was the debut performance of inSight, using software I developed to emulate the visual hallucinations experienced in the condition palinopsia. On stage, an iPad faces the audience, and its real-time camera feed is projected back at them. The performer then draws their hallucinations onto the iPad screen in order to communicate them as they happen, and the audience sees these hallucinations (overlaid on the camera feed of themselves) in the projection.

Furthermore, the interaction with the iPad sends control data wirelessly to music software on a laptop (again programmed by me), which ‘musifies’ the visual information — that is, turns it into music — according to how the software is configured. Below is a photo of the performance, with Alexis on the left-hand side interacting with the iPad. On the right is Simon, the flautist, who performed to a score and improvised with the visual hallucinations as they were generated. I might put the music software up on here soon, as well as a video of the app interacting with the music. The next step in developing this is to integrate a controllable (and suitable) soundbed into the app itself using libpd.
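The post doesn’t specify how the iPad’s control data reaches the laptop; a common choice for this kind of app-to-laptop link is OSC over UDP, which Pure Data and Max patches can receive directly. Below is a minimal Python sketch of that idea — the `/insight/touch` address and normalised touch coordinates are my own illustrative assumptions, not details from the actual app.

```python
import socket
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message (address + float args) as bytes.

    OSC strings are NUL-terminated and padded to a 4-byte boundary;
    floats are big-endian 32-bit, preceded by a type-tag string.
    """
    def pad(b):
        # Always append at least one NUL, then pad to a multiple of 4.
        return b + b"\x00" * (4 - len(b) % 4)

    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", float(a))
    return msg

def send_touch(sock, host, port, x, y):
    """Send normalised touch coordinates (0.0-1.0) as one OSC message.

    The address '/insight/touch' is hypothetical, chosen for illustration.
    """
    sock.sendto(osc_message("/insight/touch", x, y), (host, port))

# Example: fire a message at a patch listening on localhost port 9000.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_touch(sock, "127.0.0.1", 9000, 0.25, 0.75)
```

In practice the laptop side would map each incoming coordinate pair to synthesis parameters (pitch, filter cutoff, and so on), which is the “musifying” step described above.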

After inSight, I performed with the BCMI (I haven’t got any photos as yet, but you can just about make out the equipment, minus me, in the middle of the photo above). As I can’t split the video signal of the interface to the projector, I have to point a video camera at the screen and take a feed from that to the projector. True to form, the camera turned itself off about a minute into my piece; in hindsight, though, I think this was a blessing in disguise. I’ve struggled with the concept of simply projecting the interface I look at, and after the camera turned itself off the theatre was left in total darkness. Only my face was lit by the monitors in front of me, and people had to focus on my concentrated gaze instead of the rather inhuman interface being projected. In fact, someone even commented that it was “…a masterclass in concentration” (which is what it actually feels like), and a few people liked how the darkness allowed them to focus on the music. Although the projection cut out, I think people had long enough to draw the link between the icons and the musical control, so I’m glad it happened: I now feel confident in taking a whole new approach to the visual aspects of performing with what I previously considered such a ‘non-performance’ system (i.e. one whose gestures are extremely difficult for the audience to perceive). Perhaps projecting a close-up of my eyes, or even moving away from such a direct link to the display, will be next to explore.

Finally, Saturday saw the iPad Trio performance, where three performers were stationed in a triangle with the audience inside. The performers then played a ‘piece’ using commercially available iPad apps. I have to say I felt the surface of the musical potential was merely scratched: it was more a meandering of noise than structured music, and the Trio therefore felt rather disconnected. The most interesting part of the piece was the integration of audience members (with their own iPads). This actually worked really well (although it felt rather short-lived), and with greater co-ordination and preparation it could have been excellent. Still, it’s pretty rare to see a contemporary music festival attempt this kind of inclusivity for audience members, especially where the playing field is so level.
