Real Time Indexing for Motion Capture Applications [2010]

In a few words

Hands-free, finger-based interaction with projected VR environments, using optical motion capture.



Virtual Reality has always made extensive use of real time motion capture to obtain information about human body movements. The reason is simple, even if its many implementations can be technically complex: the more we know about a person, the better the immersion experience we can provide. Ideally, this would mean knowing and processing everything about every single part of the body, from head to toes, including ears, eyes, facial expressions, hand and fingertip postures, and so on – not to mention a person's intentions and emotions.

This project was born from the requirements of a wall-projected VR setup, which used real time optical motion capture to infer the position of reflective markers placed on participants' fingertips. Once this information is known, it can be used to create compelling natural interaction paradigms, such as picking up virtual objects directly with one's fingertips.

A recurrent problem in this kind of setup is keeping a coherent reference to each of the tracked markers placed on the body. Markers can easily get occluded during tracking, so their references can occasionally get swapped. The purpose of this study was to develop, test and exploit a real time algorithm capable of handling this problem, without requiring a priori knowledge about the configuration or number of markers.
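To illustrate what such an indexing step must do, the sketch below (hypothetical code, not the published algorithm) carries marker indices from one frame to the next by greedily matching the closest previous/current point pairs first, so that a noisy or drifting point cannot steal another marker's nearest candidate:

```python
import math

def reindex(prev_indexed, new_points):
    """Carry marker indices from the previous frame onto an unordered
    point cloud, greedily matching the closest pairs first.

    prev_indexed: dict {index: (x, y, z)} from the previous frame.
    new_points:   list of (x, y, z) tuples, in arbitrary order.
    Returns a dict {index: (x, y, z)} preserving the old indices.
    """
    # All (distance, old index, new point position) candidates, closest first.
    pairs = sorted(
        (math.dist(p, q), idx, j)
        for idx, p in prev_indexed.items()
        for j, q in enumerate(new_points)
    )
    result, taken = {}, set()
    for _, idx, j in pairs:
        if idx in result or j in taken:
            continue  # this index or this point is already matched
        result[idx] = new_points[j]
        taken.add(j)
    return result
```

For example, with markers 0 and 1 at (0, 0, 0) and (1, 0, 0) in the previous frame, a shuffled new frame [(1.05, 0, 0), (0.02, 0, 0)] is re-indexed to {0: (0.02, 0, 0), 1: (1.05, 0, 0)}. Occluded or newly appearing markers would need extra handling, which is where a recovery step becomes necessary.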

Solution and VR testing

The implemented solution includes a main indexing algorithm and a secondary indexing recovery algorithm. The main indexing technique keeps the most plausible indexing of an arbitrary number of points, while the recovery technique adds an indexing correction feature on top of it. The recovery technique was designed with Virtual Reality applications in mind, but not exclusively.

The solution was tested within a VR environment. During the tests, participants were asked to recreate a number of virtual object configurations by copying, moving and deleting a given number of different objects. These interactions were triggered thanks to the algorithm's real time distinction between three tracked fingers: pinching an object with the thumb and index fingers moved it; pinching with the thumb and middle finger created a duplicate of the object, which could then be moved and placed anywhere; inserting the index and middle fingers into an object destroyed it.

Analysis of the test data gave numeric information on the algorithm's behaviour, and observations made during development and testing provided useful clues for further work. The algorithm has been used in subsequent setups where finger-based interaction was required (Dissonance, vrGrains).
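The gesture mapping described above can be sketched roughly as follows (the threshold value and function names are illustrative assumptions, not taken from the actual implementation):

```python
import math

PINCH_THRESHOLD = 0.02  # metres; illustrative value, not from the study

def detect_interaction(thumb, index, middle, inside_object):
    """Map the three indexed fingertips to an interaction.

    thumb, index, middle: (x, y, z) fingertip positions.
    inside_object(point) -> bool: is the point within a virtual object?
    """
    if math.dist(thumb, index) < PINCH_THRESHOLD:
        return "move"    # thumb + index pinch moves the object
    if math.dist(thumb, middle) < PINCH_THRESHOLD:
        return "copy"    # thumb + middle pinch duplicates it
    if inside_object(index) and inside_object(middle):
        return "delete"  # index + middle inserted into the object
    return None
```

Note that this mapping only works if the three indices stay correctly assigned to their fingers: a single swap between index and middle would silently turn a move into a copy, which is exactly the failure mode the indexing and recovery algorithms address.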


How does it work

The VR environment used to test the algorithm was created using the XVR platform. The stereoscopic content was projected onto a 4.4 m × 2 m Powerwall. Head tracking was performed with an ultrasonic Intersense tracker, and a set of 12 OptiTrack cameras provided tracking information for the users' fingertips. Additionally, a CyberGlove device was used as a ground-truth reference for real time indexing during the evaluation of the indexing algorithm.

Mazzanti, D., Zappi, V., Brogni, A. and Caldwell, D., "Point Clouds Indexing in Real Time Motion Capture", Proceedings of the 18th International Conference on Virtual Systems and Multimedia, 2–5 September 2012, Milan, Italy.

dario mazzanti, 2021
