Augmented Reality Drag and Drop [2014]

In a few words

Physical drag and drop for virtual objects in Augmented Reality.

Context

Innovative paradigms for mobile Augmented Reality aim to increase the naturalness of interaction with AR objects displayed on a mobile device and distributed within a real environment. This approach to interaction design is strongly connected with Natural User Interface design. Natural interface paradigms are inspired by human actions that carry meaning in the physical world; the goal is a user experience closer to real-world behaviour, and therefore a more intuitive interaction. The design of natural interaction paradigms can draw on a Reality-Based Interaction (RBI) approach, which increases the realism of interface elements, allowing users to interact with them more directly. RBI mimics everyday practices from the non-digital world that are familiar to users: gestures are a typical way of triggering interaction in these contexts.

The Interaction Paradigm

The AR drag and drop interaction paradigm is based on the concept of physically dragging and dropping virtual objects associated with physical AR targets, using a mobile device. The idea for this paradigm was born from a collaboration with Giacinto Barresi, who came up with the concept of physical drag and drop of virtual entities. Through the device camera, users can look at the augmented environment and at the virtual objects associated with AR targets. An object can be picked up from its target and linked to the smartphone with a simple accelerometer-based shake gesture, performed with the hand holding the device. By carrying the smartphone around, the user then moves the virtual object through space. A second shake gesture, performed towards an AR target, drops the object, which becomes associated with that new target.
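The gesture implementation is not detailed in the publication; as a rough sketch, a shake can be detected in Unity by thresholding the accelerometer magnitude. The threshold values, helper methods and pick/drop logic below are illustrative assumptions, not the original code:

    using UnityEngine;

    // Sketch of accelerometer-based shake detection for pick/drop.
    // Thresholds and helper methods are assumptions.
    public class ShakeDragDrop : MonoBehaviour
    {
        const float ShakeThreshold = 2.0f; // acceleration in g; tune per device
        const float Cooldown = 0.5f;       // seconds between accepted gestures
        float lastShakeTime;
        GameObject carriedObject;          // virtual object currently carried

        void Update()
        {
            // Input.acceleration is the device acceleration in g units.
            if (Input.acceleration.magnitude > ShakeThreshold &&
                Time.time - lastShakeTime > Cooldown)
            {
                lastShakeTime = Time.time;
                if (carriedObject == null)
                    TryPickUp(); // detach the object from the target in view
                else
                    TryDrop();   // re-associate it with the target in view
            }
        }

        void TryPickUp() { /* parent the tracked target's object to the camera */ }
        void TryDrop()   { /* parent the carried object to the tracked target */ }
    }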

Multiple users can interact simultaneously with the environment, using multiple devices. The association of each object with specific AR targets is constantly updated on every device, so all users experience the same environment, in which the same objects are associated with the same targets. This idea of multiple users sharing one AR environment was exploited in the Augmented Stage project, to create a shared interactive environment in which multiple users could modify the sonic and visual features of a participatory musical performance.
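How this synchronisation was implemented is not described here; conceptually, each pick or drop only needs to propagate a new object–target association to every device. A minimal sketch of such an update message in Unity C# (the field names and the use of JsonUtility are assumptions, not the original protocol):

    using UnityEngine;

    // Hypothetical state-update message: which virtual object is now
    // associated with which AR target. All field names are assumptions.
    [System.Serializable]
    public class AssociationUpdate
    {
        public string objectId; // virtual object identifier
        public string targetId; // AR target it is now attached to
        public string deviceId; // device that performed the gesture
    }

    public static class AssociationSync
    {
        // Serialize an update for broadcast to the server and other clients.
        public static string Encode(AssociationUpdate u)
        {
            return JsonUtility.ToJson(u);
        }

        // Apply an update received from the network.
        public static AssociationUpdate Decode(string json)
        {
            return JsonUtility.FromJson<AssociationUpdate>(json);
        }
    }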

How does this work

The AR application used to develop and test the AR drag and drop paradigm was built with the Unity engine and deployed as an Android APK. AR tracking is performed with the Vuforia SDK, which integrates easily with Unity. For shared environments where multiple users access the same AR objects, a server application, also developed in Unity, ran on a PC.
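As an illustration of how target visibility can be observed from Unity, the classic Vuforia Unity API (contemporary with this project) exposed a TrackableBehaviour component and an ITrackableEventHandler interface. The drop-candidate logic in the comment is an assumption:

    using UnityEngine;
    using Vuforia;

    // Sketch: react to an AR target entering or leaving tracking,
    // using the classic Vuforia Unity API.
    public class TargetWatcher : MonoBehaviour, ITrackableEventHandler
    {
        TrackableBehaviour trackable;

        void Start()
        {
            trackable = GetComponent<TrackableBehaviour>();
            if (trackable != null)
                trackable.RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                           newStatus == TrackableBehaviour.Status.TRACKED;
            // A visible target would be a candidate for the next drop gesture.
            Debug.Log(trackable.TrackableName + (visible ? " in view" : " lost"));
        }
    }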
The interaction paradigm was used for a behavioural research study, in which optical tracking of the users' wrists was performed with an OptiTrack setup consisting of four cameras. A DLL loaded within the Unity server was used to access and log the OptiTrack tracking data.
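The interface of that DLL is not documented here; as a generic illustration, native tracking data can be pulled into Unity through P/Invoke. The library name and both function signatures below are hypothetical:

    using System.Runtime.InteropServices;
    using UnityEngine;

    // Generic P/Invoke sketch for reading wrist positions from a native
    // tracking DLL. "TrackingPlugin" and both entry points are hypothetical;
    // they do not correspond to a documented OptiTrack interface.
    public class WristLogger : MonoBehaviour
    {
        [DllImport("TrackingPlugin")]
        static extern int ConnectToTracker();

        [DllImport("TrackingPlugin")]
        static extern void GetWristPosition(out float x, out float y, out float z);

        void Start()
        {
            if (ConnectToTracker() != 0)
                Debug.LogWarning("Tracker connection failed");
        }

        void Update()
        {
            float x, y, z;
            GetWristPosition(out x, out y, out z);
            // Log a timestamped position sample for offline analysis.
            Debug.Log(Time.time + "," + x + "," + y + "," + z);
        }
    }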

Publication

Repetitive Drag & Drop of AR Objects: a Pilot Study, Barresi, G., Mazzanti, D., Caldwell, D. and Brogni, A.
Proceedings of 2014 IEEE International Conference on Complex, Intelligent and Software Intensive Systems (CISIS 2014), 2 – 4 July 2014, Birmingham, UK.

dario mazzanti, 2021
