In 2015, before submitting my PhD thesis, I spent some time experimenting with basic hardware interfaces which could be recognized and programmed via a mobile Augmented Reality application.
The architecture of this experiment, which can be seen in the video below, included:
- an Arduino managing two controllers and two LEDs (a rough sketch of this part is shown after the list);
- some Processing applications, which could be controlled via the controllers;
- a server application managing the mapping between each controller and the selected application;
- an Android smartphone running the AR application used to program the controllers.
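
To give an idea of how simple the hardware side was, here is a minimal sketch of what the Arduino part might have looked like. The pin assignments, the use of analog controllers, and the serial message format are all assumptions for illustration; the original post does not specify them.

```cpp
// Hypothetical sketch: two analog controllers (e.g. potentiometers) on A0/A1,
// two status LEDs on pins 12/13. Controller readings are streamed over serial
// so a host application can forward them to the mapped Processing sketch.
// Pin numbers and the "C<index>:<value>" message format are assumptions.

const int CONTROLLER_PINS[2] = {A0, A1};
const int LED_PINS[2]        = {12, 13};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 2; i++) {
    pinMode(LED_PINS[i], OUTPUT);
  }
}

void loop() {
  for (int i = 0; i < 2; i++) {
    int value = analogRead(CONTROLLER_PINS[i]);           // 0..1023
    digitalWrite(LED_PINS[i], value > 512 ? HIGH : LOW);  // LED mirrors controller state

    // One line per controller, e.g. "C0:734", parsed by the host application.
    Serial.print("C");
    Serial.print(i);
    Serial.print(":");
    Serial.println(value);
  }
  delay(50);  // roughly 20 updates per second
}
```

On the other end, the server application would presumably read these lines and forward each controller's value to whichever Processing application it was currently mapped to.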
The idea behind this experiment was to explore the field of tangible user interfaces. It is a somewhat old experiment and concept, but fun and interesting nonetheless!