Ever since playing with iPhones as music interfaces with the London Community iPhone OSCestra at Open Hack London in May, I’ve been wondering how I could use my iPhone as a controller in my rock/electronic band 100 robots. The 100 robots setup has Max and me playing live drums and guitar over a number of tracks played from Ableton Live. If I could mount an iPhone on the guitar, I could manipulate the tracks playing from Ableton, making the whole experience more varied and live, and less like two people playing over a backing track.
The biggest problem with using the OSCestra setup with 100 robots is that we run Ableton on Windows, so that we can use a number of plugins that aren’t available on OS X. That ruled out Osculator, the OS X application that converts OSC data from Mrmr into MIDI data we could feed to Ableton: the really simple and easy solution I recommend if you’re using OSC and Ableton on OS X.
Some quick Googling showed that we could potentially use the open source and incredibly powerful pd to read OSC and send MIDI to Ableton, but pd seemed a bit heavyweight and complex to use just for controller mapping in a live environment, which needs to be stable and predictable under heavy load while streaming six audio tracks and recording two tracks live.
With some more research I discovered LiveAPI: an open source project that exposes the Python interpreter embedded in Ableton Live. LiveAPI lets you write Python code that listens to events from Ableton and controls Ableton in response. Conveniently, LiveAPI also includes LiveOSC, an OSC server that makes it quick to map OSC messages from applications like Mrmr to Python methods that use LiveAPI to control Ableton.
In the end it only took a couple of hours and a few lines of Python code to rig up Mrmr on the iPhone to control Ableton on Windows and a few minutes more to build a guitar mount for the iPhone from a cable tie and a piece of gaffa tape.
Using the iPhone live with 100 robots:
(Cinematography by David Packer)
Although it was a really quick hack to build, LiveAPI is somewhat fiddly to set up, so I thought I should document the process of wiring things up. If you’d like to build your own open source, guitar-mounted, multi-touch OSC interface for Ableton Live 8 running on Windows, follow these instructions:
- Install Mrmr on your iPhone or iPod touch
- Design your Mrmr interface using a text editor, or the interface builder if you have a Mac. The 100 robots mmr file is here.
- Upload the mmr file to a web server.
- Connect the iPhone to the same network as the computer running Ableton.
- Run Mrmr on the iPhone and download the mmr file from your web server to get your interface running on the iPhone.
- Download LiveOSC
- Unpack the LiveOSC zip into a new LiveOSC folder created inside the Resources/MIDI Remote Scripts directory in your Ableton installation.
- Edit Resources/MIDI Remote Scripts/LiveOSC/LiveOSCCallbacks.py to add new callbacks for Mrmr to the __init__ method of the LiveOSCCallbacks class. Widgets are numbered sequentially from 0 in the mmr file, so the line below registers a callback for the first widget in the file. The 100 robots callbacks look like this:
""" 100robots callbacks """ self.callbackManager.add(self.toggleBass, "/mrmr/pushbutton/0/iPhone1")
- Add methods to Resources/MIDI Remote Scripts/LiveOSC/LiveOSCCallbacks.py to respond to the messages from Mrmr. These can use LiveAPI freely to control Ableton. The 100 robots callbacks toggle the first parameter of effect racks on different tracks to turn them on and off:
```python
def toggleBass(self, msg):
    """Called when a /mrmr/pushbutton/0/iPhone1 message is received."""
    track = 38
    device = 0
    parameter = 0
    # msg arrives as [address, type tags, argument, ...], so the widget
    # value sent by Mrmr is the first argument.
    value = msg[2]
    LiveUtils.getSong().visible_tracks[track].devices[device].parameters[parameter].value = value
```
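The callback above writes whatever value the widget sends straight into the parameter, so the on/off state lives in the Mrmr button. If you'd rather have the callback flip the parameter itself on every press, the logic is a one-liner. A sketch, with a stub class standing in for Live's DeviceParameter since the real API only exists inside Ableton's embedded interpreter:

```python
class StubParameter(object):
    """Stands in for Live's DeviceParameter when testing outside Ableton."""
    def __init__(self, value=0.0):
        self.value = value

def toggle_parameter(parameter):
    # Flip an on/off style parameter between 0.0 and 1.0 regardless
    # of the incoming OSC value.
    parameter.value = 0.0 if parameter.value else 1.0

param = StubParameter(0.0)
toggle_parameter(param)  # on
toggle_parameter(param)  # off again
```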
- Select LiveOSC as a control surface in the Ableton preferences. This loads LiveOSCCallbacks.py and sets up your mappings. Interact with the controls in Mrmr and watch Ableton Live respond!
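You can sanity-check the mapping without the iPhone by sending an OSC packet from a Python script on the same machine. A minimal sketch that hand-builds the packet, assuming LiveOSC is listening on its default UDP port 9000 and that the widget sends a single float (if your Mrmr widgets send integers, swap ",f" and ">f" for ",i" and ">i"):

```python
import socket
import struct

def osc_message(address, value):
    """Build a minimal OSC packet with one float32 argument.

    OSC strings are null-terminated and padded to a 4-byte boundary;
    the type tag string ",f" declares a single big-endian float.
    """
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)

    return (pad(address.encode("ascii"))
            + pad(b",f")
            + struct.pack(">f", value))

# Simulate pressing the first Mrmr pushbutton.
packet = osc_message("/mrmr/pushbutton/0/iPhone1", 1.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
```

If Ableton is running with LiveOSC loaded, sending this should fire the toggleBass callback just as a press on the phone would.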
The full, modified 100 robots LiveOSCCallbacks.py.