To help shoppers make informed decisions quickly, this gadget uses dynamic lights and a barcode scanner. For an early-stage prototype I worked with the research team on controlling 16 LEDs embedded in the handlebar using Arduino/Processing. The video shows a later version. For more information, see this article by Fast Company.
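As a rough illustration of the kind of animation logic involved (this is a hypothetical sketch, not the prototype's actual code), a "scanner" pattern over 16 LEDs can be computed per frame on the host side, with each brightness value then sent to an Arduino PWM pin:

```java
// Hypothetical sketch: a chase pattern over 16 LEDs. The active LED is
// fully lit, its immediate neighbours dimly lit, everything else off.
public class LedChase {
    static final int NUM_LEDS = 16;

    // Brightness levels (0..255) for each LED at a given animation step.
    static int[] frame(int step) {
        int[] levels = new int[NUM_LEDS];
        int active = step % NUM_LEDS;
        for (int i = 0; i < NUM_LEDS; i++) {
            int dist = Math.abs(i - active);
            levels[i] = dist == 0 ? 255 : dist == 1 ? 64 : 0;
        }
        return levels;
    }

    public static void main(String[] args) {
        int[] f = frame(3);
        System.out.println(f[3] + " " + f[2] + " " + f[0]); // 255 64 0
    }
}
```

In the real setup, a loop like this would run in Processing and stream the levels over serial to the Arduino each frame.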
Named after the fabulous Elm programming language, this game challenges two players to a hilarious fight to the death. The goal was to create a fun game that needs no colour, sound or complex shapes. Using the most minimalistic avatars (circles) and environment (a rectangle) imaginable, the game distils the visceral experience of facing another human down to the basics. After a minute, are you still looking at a circle, or straight into your opponent's soul?
Researchers at Intel and University College London have installed a network of 15 smart ultrasound sensors across the London Olympic Park area in order to support the study and protection of local wildlife. My role in this ongoing project is to visualise the realtime bat data so that ecologists, park management and the general public can access and understand it easily on an ordinary smartphone or computer. For this purpose I have worked closely with biodiversity experts, conducted user studies and developed a series of prototypes. The latest version can be seen at batslive.org. Please be advised that this application is still a work in progress and may contain some glitches. Nevertheless, feel free to email me your feedback. Public launch is planned for the end of the month.
Up to 4 players can collaborate on silly sentences by replacing the noun, verb, etc. Using 4 tablets and a projector, the game was a success with groups of children at Brighton Science Festival 2016. It is playable online and works on tablets, smartphones and other devices. Live Sentences was developed during my time as a visiting researcher at ChaTLab, University of Sussex.
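The core mechanic can be sketched as slot replacement in a sentence template: each player owns a named slot and swaps in their own word. (A hypothetical illustration, not the game's actual code.)

```java
import java.util.Map;

// Hypothetical sketch of the Live Sentences mechanic: a template with
// named slots, each of which one player can replace at any time.
public class LiveSentencesDemo {
    static String fill(String template, Map<String, String> slots) {
        String out = template;
        for (Map.Entry<String, String> e : slots.entrySet()) {
            out = out.replace("{" + e.getKey() + "}", e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        String template = "The {adjective} {noun} {verb} over the {place}.";
        String sentence = fill(template, Map.of(
                "adjective", "sparkly", "noun", "robot",
                "verb", "dances", "place", "moon"));
        System.out.println(sentence);
        // The sparkly robot dances over the moon.
    }
}
```

In the multi-device version, each tablet would control one slot and the projector would show the combined sentence as it changes.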
Enjoy the entire trilogy on SoundCloud.
Speech generation was made easy thanks to the fabulous MARY TTS Java library. Csound was used for everything else.
This application was programmed entirely in Processing. It uses a commercial interactive tabletop to help groups of up to 4 tourists plan an enjoyable day out in Cambridge. It was developed at The Open University and published as a full research paper at the CHI conference in 2011.
When jazz legends Carla Bley and Steve Swallow played at Philharmonie Essen, Germany, in 2009, I had the honour of designing live visuals for the event. A custom-size projection screen (8 meters) was commissioned and I brought artists Henrik Lippke and Thamya Rocha into the team. Together we developed a tool (Processing, Pure Data and multiple MIDI controllers) to generate free-floating bubbles that could move individually or in dynamic formations, adapt to the music, change their shapes and colours and leave trails. Following Thamya's artistic direction, Henrik and I performed the visuals along with the music, focusing on very slow-moving systems of colour and light. The video shows the entire concert in fast forward.
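The "individually or in dynamic formations" behaviour can be sketched with a simple easing model: each bubble drifts toward its own formation target, covering a fixed fraction of the remaining distance every frame. (A hypothetical illustration, not the performance tool itself.)

```java
// Hypothetical sketch of the bubble motion model: exponential easing
// toward a per-bubble formation target gives the slow, smooth movement
// described above; retargeting all bubbles at once forms a new shape.
public class Bubble {
    double x, y;    // current position
    double tx, ty;  // formation target
    static final double EASE = 0.1; // fraction of remaining distance per frame

    Bubble(double x, double y) { this.x = x; this.y = y; }

    void setTarget(double tx, double ty) { this.tx = tx; this.ty = ty; }

    // Move a fixed fraction of the remaining distance toward the target.
    void update() {
        x += (tx - x) * EASE;
        y += (ty - y) * EASE;
    }

    public static void main(String[] args) {
        Bubble b = new Bubble(0, 0);
        b.setTarget(100, 50);
        for (int i = 0; i < 200; i++) b.update();
        System.out.printf("%.2f %.2f%n", b.x, b.y); // converges to 100.00 50.00
    }
}
```

In the live tool, MIDI controllers would drive parameters like the easing rate and the current formation, while Pure Data handled the audio analysis.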
Realtime animated fractals surround the audience on a winter evening. Shaped as human figures, roads or trees, they react to the sound of the audience and occasionally morph into each other. Why shouldn't legs be seen as branches or forks in the road? What makes a forest different from a crowd? Is it just a matter of scale and angle? This work playfully reflects on the concept of self-similarity by extending it from the individual (fractal) object to the boundaries between types of objects as well as story elements. The installation took place at an art college surrounded by trees and roads.
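The self-similarity at the heart of the piece can be sketched as recursive branching: each branch spawns two smaller copies of itself, and whether the result reads as a tree, a figure or a forking road is only a matter of angle and scale. (A hypothetical illustration, not the installation's renderer.)

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: a recursive binary branching structure whose
// segments could be drawn as limbs, branches or forks in a road.
public class FractalBranches {
    // A line segment from (x1,y1) to (x2,y2).
    record Segment(double x1, double y1, double x2, double y2) {}

    static void branch(double x, double y, double angle, double length,
                       int depth, List<Segment> out) {
        if (depth == 0 || length < 1) return;
        double x2 = x + Math.cos(angle) * length;
        double y2 = y + Math.sin(angle) * length;
        out.add(new Segment(x, y, x2, y2));
        // Two smaller copies of the whole: the self-similarity at work.
        branch(x2, y2, angle - 0.4, length * 0.7, depth - 1, out);
        branch(x2, y2, angle + 0.4, length * 0.7, depth - 1, out);
    }

    public static void main(String[] args) {
        List<Segment> segments = new ArrayList<>();
        branch(0, 0, -Math.PI / 2, 100, 5, segments);
        // A full binary tree of depth 5 yields 2^5 - 1 = 31 segments.
        System.out.println(segments.size()); // 31
    }
}
```

In the installation, parameters like the branching angle would be modulated in real time by the sound of the audience, morphing one reading of the structure into another.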