To help shoppers make informed decisions quickly, this gadget uses dynamic lights and a barcode scanner. For an early-stage prototype, I worked with the research team on controlling 16 LEDs embedded in the handlebar using Arduino and Processing. The video shows a later version. For more information, see this article by Fast Company.
Named after the fabulous Elm programming language, this game challenges two players to a hilarious fight to the death. The goal was to create a fun game that needs no colour, sound or complex shapes. Using the most minimalistic avatars (circles) and environment (a rectangle) imaginable, the game lays bare the visceral experience between two humans when things come down to the basics. After a minute, are you still looking at a circle or straight into your opponent's soul?
batslondon.com is a mobile website I developed to make wildlife data available to the general public. It is part of the Nature-Smart Cities project (University College London and Intel). More information is available in this BBC article.
This game allows up to four children to collaborate on a sentence by replacing the noun, verb, and so on. Live Sentence was developed during my time as a visiting researcher at ChaTLab, University of Sussex. It was performed at the Brighton Science Festival 2016.
A study in information density made with Csound and the fabulous speech generation library. This is a stereo version of the original 4-channel piece.
This application was programmed entirely in Processing. It uses a commercial interactive tabletop to help groups of up to 4 tourists plan an enjoyable day out in Cambridge. It was developed at The Open University and published as a full research paper at the CHI conference in 2011.
When jazz legends Carla Bley and Steve Swallow played at Philharmonie Essen, Germany, in 2009, I had the honour to design live visuals for the event. A custom-size projection screen (8 metres) was commissioned and I brought artists Henrik Lippke and Thamya Rocha into the team. Together we developed a tool (Processing, Pure Data and multiple MIDI controllers) to generate free-floating bubbles that could move individually or in dynamic formations, adapt to the music, change their shapes and colours, and leave trails. Following Thamya's artistic direction, Henrik and I performed the visuals along with the music, focusing on very slow-moving systems of colour and light. The video shows the entire concert in fast forward.
Real-time animated fractals surround the audience on a winter evening. Shaped as human figures, roads or trees, they react to the sound of the audience and occasionally morph into one another. Why shouldn't legs be seen as branches or forks in the road? What makes a forest different from a crowd? Is it just a matter of scale and angle? This work playfully reflects on the concept of self-similarity by extending it from the individual (fractal) object to the boundaries between types of objects as well as story elements. The installation took place at an art college surrounded by trees and roads.
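The self-similarity at the heart of this piece can be sketched in a few lines of code. The fragment below is a hypothetical illustration, not the installation's actual code (which was animated and sound-reactive): each branch spawns two smaller copies of itself, so the same shape recurs at every scale, whether you read it as a tree, a road network or a figure with limbs.

```python
import math

def branch(x, y, angle, length, depth):
    """Recursively generate the line segments of a self-similar branching figure.

    Every branch spawns two smaller, rotated branches, so the whole
    shape repeats at each scale -- the self-similarity the piece plays with.
    """
    if depth == 0:
        return []
    # End point of this branch
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments = [((x, y), (x2, y2))]
    # Two child branches: rotated left/right and scaled down
    segments += branch(x2, y2, angle - 0.4, length * 0.7, depth - 1)
    segments += branch(x2, y2, angle + 0.4, length * 0.7, depth - 1)
    return segments

# A depth-5 figure: 1 + 2 + 4 + 8 + 16 = 31 segments
tree = branch(0.0, 0.0, -math.pi / 2, 100.0, 5)
print(len(tree))  # 31
```

Changing only the branch angles and scale factors turns the same recursion into figures that read as trees, forks in a road, or limbs — scale and angle are indeed most of the difference.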