Here’s a screenshot of a bit of my upcoming piece, Flex, which I’ll be performing next month at Sight, Sound, Space & Play in Leicester. It’s a musical game that uses a Brain-Computer Interface as the controller for the sound. The system selects elements of the sound and assigns controls to them in a quasi-random fashion, leaving the performer to figure out the mapping while composing on the fly. The front end uses Integra Live, but the real-time mappings from the brain data are handled by Pure Data, which sets the rules and parameters of the game.
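
For a rough idea of what that quasi-random assignment means in practice, here’s a minimal sketch in Python. The real logic lives in a Pure Data patch, so the feature and parameter names below are hypothetical stand-ins, not the actual mappings used in Flex:

```python
import random

# Hypothetical BCI features and sound parameters -- stand-ins only;
# the actual piece maps brain data inside a Pure Data patch.
BCI_FEATURES = ["alpha_power", "beta_power", "blink_rate", "attention"]
SOUND_PARAMS = ["pitch", "filter_cutoff", "grain_size", "pan_angle"]

def assign_controls(seed=None):
    """Shuffle the sound parameters and pair each with a BCI feature,
    so the performer has to discover the mapping by ear."""
    rng = random.Random(seed)
    params = SOUND_PARAMS[:]
    rng.shuffle(params)
    return dict(zip(BCI_FEATURES, params))

mapping = assign_controls()
print(mapping)  # e.g. {'alpha_power': 'grain_size', ...}
```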

I’m slowly finding all of the fun bugs inside Integra, which range from randomly dropping audio altogether to dodgy built-in MIDI functionality. Still, it’s an extremely promising platform, and I can’t wait for the developers to open it up for pd coding integration.

One of the good things about Integra Live is its multichannel file support and the simplicity of controlling surround panning. As the piece I’m building is quadraphonic, the layout of the controls makes things so much easier and saves a lot of faffing in pd. Still, the interface is very processor-heavy: start adding a lot of modules and things begin to slow down quickly. One way to avoid this is to share a single instance of the common modules (i.e. input and output modules) rather than duplicating them, and to confine things to as few blocks as possible. I’ll post a video/screencast of the piece sometime soon.
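
For anyone curious what the panning Integra Live handles for you would otherwise involve, here’s a generic textbook equal-power quad pan sketched in Python. This is not Integra Live’s actual implementation, just the standard formula you’d end up patching by hand in pd:

```python
import math

def quad_pan(x, y):
    """Equal-power pan across a quad layout (FL, FR, RL, RR).
    x, y in [0, 1]: x pans left -> right, y pans front -> rear.
    Squared gains always sum to 1, so perceived loudness is constant."""
    ax, ay = x * math.pi / 2, y * math.pi / 2
    fl = math.cos(ax) * math.cos(ay)
    fr = math.sin(ax) * math.cos(ay)
    rl = math.cos(ax) * math.sin(ay)
    rr = math.sin(ax) * math.sin(ay)
    return fl, fr, rl, rr

print(quad_pan(0.5, 0.0))  # centred between the front pair
```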
