AvatarKontrol and four:Play [2012]

In a few words

Audio/visual performances based on audience participation, carried out through personal mobile devices.

Context

Participatory art performances engage their audience by allowing spectators to interact with the work presented by the performers. Spectators may access different aspects of the performance, as individuals or as a whole crowd.

These two performances are founded on the idea that audience engagement in musical performances can be remarkably increased by giving spectators access to part of the performing stage itself. Valerio Visconti, Victor Zappi, Marco Gaudina and I share this idea, and we collaborated on the design and development of a novel platform for participatory performances.
Thanks to this platform, the audience can meaningfully influence aspects of a performance: by controlling instruments played by the performers and the projected visuals, spectators take an active role in the final outcome, making each performance truly unique.

The Performances

The core idea of these performances is accessibility: through an open Wi-Fi network, spectators can use the web browser on their own devices to participate. No specific app has to be installed: participants are automatically redirected to a web page acting as an interface to the performance. Through this interface, participants choose a personal name and an avatar, which are displayed right away on the projection behind the performers. From then on, participants can use the interface to control their avatar's movements and the visuals.

At specific moments of the performance, special areas appear in the visuals. These areas grant those who reach them direct access to audio/visual parameters of the performance, for a limited amount of time. During this period, the interface on the participant's device turns into a slider or a knob, depending on the kind of interaction (audio or visual): by moving a slider, a participant can modify the audio parameters of one of the instruments played on stage; by rotating a knob, they can change the behavior and appearance of the visuals.
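As an illustration, a browser-side interface along these lines could be sketched as follows; the WebSocket endpoint, message format and function names are assumptions made for the example, not the original code:

```javascript
// Hypothetical browser-side participant interface.
// The endpoint and message format are assumptions, not the original code.
const socket = new WebSocket('ws://192.168.1.1:8080');

// Steer the avatar by sending movement deltas to the server.
function sendMove(dx, dy) {
  socket.send(JSON.stringify({ type: 'move', dx, dy }));
}

// Forward the current slider/knob position, normalized to 0..1.
function sendControl(kind, value) {
  socket.send(JSON.stringify({ type: 'control', kind, value }));
}

// Minimal stand-in for the interface swap: show a single range input.
function showControl(kind) {
  const input = document.createElement('input');
  input.type = 'range';
  input.min = '0';
  input.max = '1';
  input.step = '0.01';
  input.oninput = () => sendControl(kind, parseFloat(input.value));
  document.body.innerHTML = '';
  document.body.appendChild(input);
}

// When the server grants access to a special area, the interface
// turns into a slider (audio) or a knob (visuals).
socket.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'grant') showControl(msg.kind); // 'slider' or 'knob'
};
```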
The audience is also invited to dance and move during specific parts of the performances: in these moments, data from their devices' accelerometers is used to drive a climax in the music and visuals.
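The accelerometer data can be read in the browser through the standard devicemotion event; a minimal sketch, reusing the socket from the example above (the sampling rate and message format are assumptions):

```javascript
// Sketch: sample the accelerometer and forward a coarse motion-energy value.
// Reuses the socket from the previous sketch; the rate cap is an assumption.
let lastSent = 0;
window.addEventListener('devicemotion', (event) => {
  const a = event.accelerationIncludingGravity;
  const now = Date.now();
  if (!a || now - lastSent < 100) return; // cap at ~10 messages per second
  lastSent = now;
  // Rough measure of how much the participant is moving.
  const energy = Math.sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
  socket.send(JSON.stringify({ type: 'motion', energy }));
});
```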

Avatar_Kontrol took second place in the Make Your Sound competition held by Electropark, and was staged for two nights at the Electropark 2012 Festival in Genoa, Italy.

four:Play presents an enhanced version of the platform, adding audience interaction with the lights on stage and a sensor used to project the performers' real-time silhouettes among the visuals, which are also open to audience interaction. This performance was among the winners of the Call4roBOt 2012 competition, and was staged at the roBOt Festival 2012 in Bologna, Italy.

How it works

A JavaScript server running on Linux manages the performance. Two routers and multiple antennas handle the connections of the audience's and the performers' devices (smartphones and laptops). The server runs on a laptop and hosts both the web interface used by the audience and the stage visuals, which are coded using HTML5, Processing.js and three.js.
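A sketch of how such a server could be structured, assuming Node.js with the ws WebSocket library (the original implementation is not shown here; all file names, ports and message formats are illustrative):

```javascript
// Hypothetical Node.js sketch of the performance server.
// Assumes the 'ws' library; names, ports and formats are illustrative.
const http = require('http');
const fs = require('fs');
const WebSocket = require('ws');

// Serve the page participants are redirected to when they join the Wi-Fi.
const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end(fs.readFileSync('interface.html'));
});

const wss = new WebSocket.Server({ server });
const avatars = new Map(); // one avatar per connected participant

wss.on('connection', (socket) => {
  const id = Math.random().toString(36).slice(2);
  avatars.set(id, { x: 0, y: 0 });

  socket.on('message', (data) => {
    const msg = JSON.parse(data);
    if (msg.type === 'move') {
      // Update the avatar position; the visuals client reads these positions.
      const avatar = avatars.get(id);
      avatar.x += msg.dx;
      avatar.y += msg.dy;
    } else if (msg.type === 'control') {
      // Route audio controls to Ableton Live as OSC (see the next sketch)
      // and visual controls to the projected visuals.
    }
  });

  socket.on('close', () => avatars.delete(id));
});

server.listen(8080);
```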
A second laptop handles the music of the performance from an Ableton Live set. The set exposes instrument parameters through a Max4Live patch, which receives the audience's control commands from the server as OSC messages. A MIDI device also allows Live to control hardware instruments, and an Arduino board connected to a serial port enables the control of on-stage hardware.
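Relaying a participant's slider value from the server to the Max4Live patch could look like this, assuming the node-osc library (the OSC address, host and port are illustrative, not the patch's actual addresses):

```javascript
// Sketch: relay an audience control value to the Max4Live patch as OSC.
// Assumes the 'node-osc' library; address, host and port are illustrative.
const { Client } = require('node-osc');

// The laptop running Ableton Live, listening for OSC through Max4Live.
const live = new Client('192.168.1.20', 9000);

// value is the normalized slider position (0..1) sent by a participant.
function forwardAudioControl(instrument, value) {
  live.send(`/live/${instrument}/param`, value);
}
```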
In four:Play, the performers' silhouettes are acquired by two Kinect sensors, which stream their data to the server and from there to the stage visuals.
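A possible sketch of the relay step, assuming silhouette frames arrive as binary masks over a local WebSocket (the actual Kinect acquisition pipeline and ports are not documented here):

```javascript
// Sketch: forward silhouette frames from a Kinect-reading process to the
// stage visuals. Assumes frames arrive as binary masks over a local
// WebSocket; the Kinect acquisition itself is not covered.
const WebSocket = require('ws');

const visuals = new WebSocket('ws://localhost:8081'); // visuals client

const kinectFeed = new WebSocket.Server({ port: 8082 });
kinectFeed.on('connection', (socket) => {
  socket.on('message', (frame) => {
    // Forward each silhouette frame straight to the stage visuals.
    if (visuals.readyState === WebSocket.OPEN) visuals.send(frame);
  });
});
```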

Videos and Pictures

https://youtu.be/a_2tphM9fz8
