Cross reality interaction tools
The main goal of this research workpackage is to define and create a set of appropriate tools and services that support the individual showcases in creating and using rich multi-modal mixed-reality user interfaces.
Because cross-reality user interfaces and interaction techniques lack standardization, creating, prototyping and evaluating user interfaces for cross-reality applications is considerably harder than for other application areas; dedicated tool support is therefore needed to make these tasks much simpler.
To address these problems, a threefold approach was taken: tools and services have been developed in the following areas.
The authoring and orchestration tools support the individual showcases in preparing, monitoring, orchestrating and evaluating a showcase event. Arbitrary maps, e.g. satellite images or conventional street maps, can be augmented with different kinds of information: user positions, actions and paths, as well as events or special locations.
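As a minimal sketch of such map augmentation, the following illustrates how an orchestration tool might record annotations (user positions, events, locations) on top of an arbitrary map and reconstruct a user's path from them. All names here (`MapOverlay`, `Annotation`) are hypothetical, not taken from the actual tools:

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    kind: str   # e.g. "user", "event", "location" (illustrative categories)
    label: str  # who or what this annotation refers to
    x: float    # position in map coordinates
    y: float

@dataclass
class MapOverlay:
    width: int
    height: int
    annotations: list = field(default_factory=list)

    def add(self, kind, label, x, y):
        # Clamp to the map bounds so off-map position reports
        # remain visible at the map's edge.
        x = min(max(x, 0), self.width)
        y = min(max(y, 0), self.height)
        self.annotations.append(Annotation(kind, label, x, y))

    def path_of(self, label):
        # Reconstruct a user's path as the ordered list of their
        # recorded positions.
        return [(a.x, a.y) for a in self.annotations
                if a.kind == "user" and a.label == label]

overlay = MapOverlay(1024, 768)
overlay.add("user", "player-1", 100, 200)
overlay.add("user", "player-1", 150, 260)
overlay.add("event", "flag-captured", 150, 260)
print(overlay.path_of("player-1"))  # [(100, 200), (150, 260)]
```

The same overlay structure can hold live data during a showcase event (monitoring) and recorded data afterwards (evaluation).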
Additionally, high-level interaction techniques are developed for specific scenarios. While these may be very specialized, they are expected to foster the development of more general interaction techniques that will be useful in other showcases as well.
Furthermore, universal cross-platform access to individual devices is essential, since applications cannot rely on the same set of input and output devices in every scenario. Devices with similar functionality should therefore be easily exchangeable, and applications should be able to support a wide range of possible input devices with no extra implementation effort. To this end, several data and event distribution interfaces have been developed.
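One common way to achieve this kind of device exchangeability is a capability-based event bus: applications subscribe to an abstract capability (e.g. "pointer") rather than to a concrete device, so any device exposing that capability can be swapped in without changing application code. The sketch below is an assumption-laden illustration of this pattern, not the project's actual interfaces:

```python
from collections import defaultdict

class EventBus:
    """Hypothetical event distribution interface: routes device events
    to subscribers by capability name, not by concrete device type."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, capability, handler):
        self._subscribers[capability].append(handler)

    def publish(self, capability, payload):
        for handler in self._subscribers[capability]:
            handler(payload)

class Device:
    """A concrete device announces only the capability it provides."""

    def __init__(self, bus, capability):
        self.bus = bus
        self.capability = capability

    def emit(self, payload):
        self.bus.publish(self.capability, payload)

bus = EventBus()
received = []
bus.subscribe("pointer", received.append)  # application-side handler

# A mouse and a gyroscopic wand both expose the "pointer" capability;
# the subscribing application needs no change when devices are swapped.
mouse = Device(bus, "pointer")
wand = Device(bus, "pointer")
mouse.emit({"x": 10, "y": 20})
wand.emit({"x": 11, "y": 21})
print(len(received))  # 2
```

Because the application only sees capability names and event payloads, adding a new input device requires implementing the device side once, with no changes to existing applications.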