In a few words
Live music performances exploiting Virtual Reality technologies, defining the concept of Hybrid Reality Performances.
The use of VR in artistic expression has always intrigued new media artists, pushing them to explore the infinite possibilities of human-coded immersive digital worlds: unconventional rules, perceptions and unexplored interaction paradigms, far removed from our physical world, become possible in VR.
I was lucky to have the opportunity to contribute to the development of Victor Zappi’s platform for Hybrid Reality performances. The platform exploits VR technologies to allow performers and virtual objects to share the same real stage, giving life to compelling audio/visual choreographies.
Two performances have been staged using the Hybrid Reality platform. I contributed to the technical development of both, and to the audio/visual design and performing aspects of Dissonance.
Spectators wearing 3D glasses perceive the depth of the virtual elements populating the stage, which to their eyes literally overlap with the real world.
This Hybrid Reality performance is a collaboration between Victor Zappi and the electronic musician Useless Idea, who presented a set of five tracks created for the occasion and directly contributed to the design of the stage interactions and audio/visual content. Each track has its own visual theme and interaction rules. In this performance, motion capture allowed both the performer and the audience to interact with the 3D virtual elements of the stage, affecting the audio/visual outcome of the piece. The performance was staged for three nights inside the VR room of IIT’s ADVR department.
Dissonance is a Hybrid Reality performance consisting of a progressive soundtrack, which is created alongside the exploration of an interactive Virtual Environment. Music generated by real instruments animates the projected VR world. The two performers (Victor Zappi and I) interact with virtual objects, modifying their position, color and shape to control and create sounds and music. The journey through the Virtual Environment introduces new elements and paradigms, while the performers explore the laws driving the sounds and visuals of each scenario. Dissonance was staged at the University of Oslo, as part of the 2011 NIME conference concerts program.
How does this work
The interactive Virtual Environments of both performances have been designed and developed using the XVR platform. XVR allows the integration of OpenGL shaders, which have been used to create the visual effects and choreographies of the virtual stages.
The OSC and MIDI protocols were used to enable communication between the XVR environment and Ableton Live, which was chosen to drive the performances’ audio content (musical instruments and audio sequences). Both Live and XVR ran on the same machine, which also took care of projecting the 3D content in front of the audience. Controllers and instruments were connected to this machine as well.
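To give an idea of what the OSC side of this communication looks like, here is a minimal sketch of how a control application can encode and send an OSC message over UDP using only the Python standard library. The address pattern `/track/1/volume`, the value, and port 9000 are hypothetical examples; the actual addresses and ports depend on the OSC mapping configured between the environment and Live.

```python
import socket
import struct

def _osc_pad(b: bytes) -> bytes:
    """Null-terminate a string and pad it to a 4-byte boundary (OSC 1.0 rule)."""
    b += b"\x00"
    while len(b) % 4 != 0:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message supporting int32 ('i') and float32 ('f') arguments."""
    msg = _osc_pad(address.encode("ascii"))
    tags = ","                      # type tag string starts with a comma
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return msg + _osc_pad(tags.encode("ascii")) + payload

# Hypothetical usage: send a volume value to an OSC listener on the same machine.
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/track/1/volume", 0.8), ("127.0.0.1", 9000))
```

Because OSC rides on plain UDP datagrams, a setup like the one described above can exchange these messages between the graphics engine and the audio host on the same machine with negligible latency.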
Virtual_Real, being staged in the fully equipped VR room of IIT’s ADVR department, could rely on a set of 12 OptiTrack motion tracking cameras to capture the performer’s and the audience’s movements, allowing them to interact with the stage’s virtual elements.
Dissonance, on the other hand, made use of a purpose-built portable VR setup, including a foldable projection screen and a small 3D projector. The tracking system used for the performance’s presentation at NIME was an IR system provided by the venue organization.
Design and Evaluation of a Hybrid Reality Performance, Zappi, V., Mazzanti, D., Brogni, A. and Caldwell, D.
Proceedings of the International Conference on New Interfaces for Musical Expression, 30 May – 1 June 2011, Oslo, Norway.
Also related to
Point Clouds Indexing in Real Time Motion Capture, Mazzanti, D., Zappi, V., Brogni, A. and Caldwell, D.
Proceedings of the 18th International Conference on Virtual Systems and Multimedia, 2 – 5 September 2012, Milan, Italy.