100 robots Vs The Audience

Wed 04 January 2012 by Jim Purbrick

A couple of years ago I had great fun putting together the London Geek Community iPhone OSCestra at Open Hack London, and I’ve been controlling Ableton Live with an iPhone taped to my guitar as part of 100 robots for a couple of years now. So when @andybudd suggested I do a digital music thing for the Brighton Digital Festival, I immediately thought it would be fun to combine the two projects by doing a 100 robots performance with audience participation.

The iPhone OSCestra was effectively a distributed, collaborative mixing desk, with each person controlling the volume and effect parameters on one channel of an Ableton Live set as it played back. For the 100 robots performance I wanted to go further and have the audience actually add parts to the musical performance, so @toastkid and I added extra drum, bass, synth and sample tracks to the 100 robots live set and filled them with samples that could be triggered by the audience.
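
To give a feel for the kind of messages involved, here’s a minimal sketch in Python using the python-osc library. The OSC addresses follow the old LiveOSC remote script’s conventions, and the host, port, track index and parameter numbers are all illustrative assumptions rather than the exact setup we used:

```python
from pythonosc.udp_client import SimpleUDPClient

# Assumed LiveOSC-style setup: Live listening for OSC on localhost:9000.
client = SimpleUDPClient("127.0.0.1", 9000)

# Each participant owns one channel of the set: their phone's fader
# drives that track's volume and an X/Y pad drives two effect
# parameters on the same track.
TRACK = 3  # hypothetical track index for this participant

def set_volume(volume: float) -> None:
    """Map the phone's fader position (0.0-1.0) to the track volume."""
    client.send_message("/live/volume", [TRACK, volume])

def set_effect_param(device: int, param: int, value: float) -> None:
    """Map one axis of the phone's X/Y pad to an effect parameter."""
    client.send_message("/live/device", [TRACK, device, param, value])

set_volume(0.8)
set_effect_param(0, 1, 0.25)
```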

While having the samples adjust in tempo to match each song was relatively simple, transposing them to match the key of each song was more complicated. First I built a custom Slice to MIDI preset which mapped the sample transpose to a macro control, and used it to slice all of the samples to MIDI tracks. I then mapped all of the transpose controls to a single MIDI controller and added a MIDI track which output the appropriate controller value for each song to a MIDI output, which was looped back into Live to transpose the samples.
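
The loop-back trick is easier to see in code than in prose. The sketch below, using the mido library, sends a per-song transpose value as a single control change to a virtual MIDI port that Live listens on; in the actual set the controller values came from a MIDI track inside Live itself, and the port name, CC number and values here are made up for illustration:

```python
import mido

TRANSPOSE_CC = 20  # hypothetical CC mapped to every sample's transpose macro

# One controller value per song, chosen so the sliced samples land in
# that song's key (these numbers are placeholders, not our real mapping).
SONG_TRANSPOSE = {
    "Song A": 64,  # no transpose
    "Song B": 67,  # up three semitones
    "Song C": 59,  # down five semitones
}

# A virtual port enabled as a MIDI input in Live's preferences.
port = mido.open_output("Live Loopback", virtual=True)

def set_song(name: str) -> None:
    """Retune all the audience samples to the key of the current song."""
    port.send(mido.Message("control_change",
                           control=TRANSPOSE_CC,
                           value=SONG_TRANSPOSE[name]))

set_song("Song B")
```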

The next question was how to avoid the performance turning into mush if multiple drum tracks or bass parts were playing concurrently. To avoid this we put dummy clips on the normal 100 robots tracks which muted the normal parts when the audience-triggered parts were playing. In some cases we let the audience parts add to the music; in others the audience parts would play instead of the normal tracks.

A final question was how Max and I could avoid getting lost when the normal parts we play along to were replaced by unfamiliar samples. To deal with this we set the clip quantization on the audience-triggered clips to values longer than the clip length. This meant that even if alternate basslines were constantly being launched, we would still hear the normal bassline for a while at the end of each quantization period, so we would know where we were in the track. To tune these settings we did some fuzz testing with semi-random MIDI data to see how much madness we could deal with and still manage to play the songs.
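
The fuzz tester was essentially a loop firing semi-random clip-launch notes at the set. Something along these lines reproduces the idea (again with mido; the port name and note range are placeholders for whatever your Live set maps to its audience clips):

```python
import random
import time

import mido

CLIP_NOTES = range(36, 52)  # hypothetical notes mapped to audience clips
port = mido.open_output("Live Loopback", virtual=True)

# Launch a random clip at semi-random intervals and see whether the
# songs stay playable underneath the resulting chaos.
for _ in range(200):
    note = random.choice(CLIP_NOTES)
    port.send(mido.Message("note_on", note=note, velocity=100))
    time.sleep(random.uniform(0.05, 0.5))
    port.send(mido.Message("note_off", note=note))
    time.sleep(random.uniform(0.1, 2.0))
```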

With the tests done it was time to perform with 100 robots and 100s of people at the Brighton Dome and Museum.

Many thanks to Steve Liddell for recording the Brighton Museum set, to @aral for letting us experiment at his Update conference, and to everyone who participated and watched. If you’d like to host another performance, please get in touch, and if you like the music, please check out the 100 robots blog and consider buying our album from Bandcamp.


An Open Source, Guitar Mounted, Multi Touch, Wireless, OSC Interface for Ableton Live

Thu 17 December 2009 by Jim Purbrick

Guitar mounted iPhone controller (100 robots images by Steve Marshall)

Ever since playing with iPhones as music interfaces with the London Community iPhone OSCestra at Open Hack London in May, I’ve been wondering how I could use my iPhone as a controller in my rock/electronic band 100 robots. The 100 robots set …

read more

The London Geek Community iPhone OSCestra

Tue 12 May 2009 by Jim Purbrick

On Friday evening, while mulling over potentially interesting hacks to build at Open Hack London, I remembered an idea I’d had a while ago: there are now loads of interesting ways to use iPhones as music interfaces, and the iPhone-to-hacker ratio at hack days tends to be …

read more