Our aim is to design a generic framework that makes it possible to turn any object into an interactive surface. The framework must guarantee compatibility between multiple surfaces by providing a runtime environment for the dynamic adaptation of an application to a surface. In other words, the framework has to make these interactive surfaces compatible with one another, so that an application developed for a table can be played on a wall or on a tray without any adaptation work.
Our main objective is to make it possible to transform such physical objects and surfaces into virtual control interfaces, using various technologies to track the interactions users perform, either with a hand or with other objects. Consequently, our research work focuses on the design and development of a framework through which digital applications can be played on the physical surfaces of common objects. For instance, a coffee table could be transformed into a game table, or a poster or a painting could become an interaction-sensitive surface.
So far, several technologies (touch screens, RFID – Radio Frequency IDentification, optical, etc.) have been investigated to transform everyday objects into interactive surfaces. These studies showed that each technology has its own advantages and drawbacks and is usually suited to specific tasks. For instance, RFID is very useful for identifying objects but not for localizing them, while acoustic technology is very good at providing position information but not identification. Based on this observation, our work aims at developing a framework, called InterFace, that integrates electromagnetic, acoustic and optical technologies in order to augment physical objects into interactive surfaces. The InterFace framework supports the use of all of these technologies, or any combination of them, on different surfaces, according to the user interaction.
Since different objects can be equipped with different technologies (e.g. the table with all three aforementioned technologies, the tray with only the acoustic and electromagnetic ones), we need a way to develop applications that are not tightly coupled to a specific interactive surface but can be played on different surfaces. The InterFace framework addresses this problem by defining the concept of "interaction modalities". An interaction modality abstracts away the technology used to provide it. The framework defines several interaction modalities: the click modality, touch modality, object modality and display modality. These provide object and finger identification, localization and tracking information to applications while encapsulating the underlying technologies. As a consequence, every application is defined in terms of the interaction modalities it needs and can be played on any surface providing those modalities, regardless of the technology used, as sketched below. Moreover, the framework also handles multi-application and multi-user issues, making it possible to run different applications simultaneously on the same surface.
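To make this decoupling concrete, the following minimal sketch models it in Java. All names here (Modality, Surface, Application, canRun) are illustrative assumptions for exposition, not the actual InterFace API: a surface advertises the modalities its technologies can provide, an application declares the modalities it requires, and the framework matches the two.

    import java.util.Set;

    /** Illustrative sketch only; not the actual InterFace API. */
    public class ModalitySketch {

        /** The interaction modalities defined by the framework. */
        enum Modality { CLICK, TOUCH, OBJECT, DISPLAY }

        /** A surface advertises the modalities its technologies provide. */
        interface Surface {
            Set<Modality> providedModalities();
        }

        /** An application declares the modalities it needs, never the
            concrete technologies (RFID, acoustic, optical) behind them. */
        interface Application {
            Set<Modality> requiredModalities();
        }

        /** An application can run on any surface covering its requirements:
            e.g. a tray offering only the touch and object modalities could
            still run any application that requires no more than those. */
        static boolean canRun(Application app, Surface surface) {
            return surface.providedModalities()
                          .containsAll(app.requiredModalities());
        }
    }

Under this scheme, porting an application from the table to the tray requires no adaptation work as long as the tray provides the modalities the application declares.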
Using the InterFace framework, two interactive surfaces have been developed: an interactive table and an interactive tray. The table has a wooden frame and a Plexiglas top. Four microphones are fixed on the upper side of the table surface, while the RFID antenna (a multi-tag reader) is fixed just under the top, along the table border. Inside the table there are a beamer (projector), an acoustic kit, four infrared lights and an infrared camera.
The tray is made of Plexiglas and is equipped with RFID and acoustic technologies. Four microphones and the RFID antenna (multi-tag reader) are fixed under its surface.
The interactive table prototype integrates different technologies, which are used to localize and identify objects and interactions (a sketch of the events they deliver to applications follows the list):

- Object identification: RFID (multi-tag reader). RFID-tagged objects are used to interact with the surface (to populate the interface with content, activate functions and personalize the interaction).

- Object or finger localization: acoustic and optical. Localization is used to select elements within the application (menu items, digital content, etc.).

- Object or finger tracking: optical and infrared. Tracking is used to interact with elements within the application (move, resize, etc.).
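The sketch below (again with hypothetical names, not the actual InterFace API) shows the kind of technology-independent events these three capabilities could deliver to an application:

    /** Illustrative listener sketch; the names are assumptions, not the
        actual InterFace API. Coordinates are in surface units. */
    interface InteractionListener {

        /** Identification: an RFID-tagged object was recognized,
            e.g. to populate the interface with its content. */
        void onObjectIdentified(String tagId);

        /** Localization: a finger or object was detected at (x, y),
            e.g. to select a menu item. */
        void onLocalized(float x, float y);

        /** Tracking: a previously detected finger or object moved,
            e.g. to move or resize an element. */
        void onTracked(int trackId, float x, float y);
    }

Whatever technologies a given surface combines (camera, microphones, RFID antenna), the application only ever sees events of this kind.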


As a next step, a thorough evaluation of the prototype will be performed in order to assess how acceptable users find it. The evaluation will focus on two main aspects: the ease of use of the interactive table and its underlying concepts, and the willingness of users to adopt such technology. The results of the evaluation will be included in the final paper.
2005 – 1st prototype: rear projection, finger detection from top camera, InteractiveTable / Tangipedia, no framework

2007 – 2nd prototype: rear projection, acoustic touch detection, InterFace framework

2008 – 3rd prototype: rear projection, finger detection from underneath camera, touch gestures, RFID multi-tags detection, InterFace framework

2011 – 4th prototype: rear projection, finger detection from underneath camera with new LED lighting system, touch gestures, RFID multi-tags detection, InterFace2 framework (multi-plugin with integrated physics)

2012 – 5th prototype: 40" LCD TV with PQ Labs Multi-Touch G3D 46", InterFace2 framework (multi-plugin with integrated physics)
