mQube - A Mobile Multi-User Mixed Reality Environment
2001 - 2004, Framework Programme: BMBF-VR/AR


www.mqube.de

The objective of the mqube project is the development of a mobile mixed reality environment that supports collaborative planning processes for multiple users. User groups are enabled to manage and solve complex planning and simulation tasks through new computer visualization technologies and through familiar cooperation and interaction mechanisms. With the aid of augmented reality technologies, virtual objects, characters, and data are projected into the real working environment of the user group. Combining real objects with virtual artifacts into tangible units creates a new form of user interface, which allows common interaction mechanisms with real objects to be transferred to virtual objects.
The potential of the mqube system is demonstrated on a miniaturized stage (mixed reality stage) for interactive event and stage planning. Further application fields include the preproduction and planning of mass events, television shows, product presentations, architecture, and city planning.

The participants' research focused on the realization of virtual characters within the mobile multi-user mixed reality environment.
For this purpose, an Animation Engine for the real-time animation of virtual characters was developed. It consists of two parts: the motion generator, which produces the animations, and the mesh generator, which handles the geometric representation and mesh deformation of the character. Motions are created with the help of dynamic motion models, which generate animations in real time by combining and manipulating pre-produced animation clips with clip operators.
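The clip-operator idea can be sketched as follows. This is a minimal illustration, not the mqube implementation: the names (`Clip`, `concat`, `blend`) and the pose representation are assumptions introduced here.

```python
# Illustrative sketch of clip operators: pre-produced clips are combined
# and manipulated at runtime. All names are hypothetical, not taken from
# the mqube Animation Engine.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Clip:
    """A pre-produced animation clip: a pose vector sampled over time."""
    duration: float
    sample: Callable[[float], List[float]]  # t in [0, duration] -> pose

def concat(a: Clip, b: Clip) -> Clip:
    """Clip operator: play clip a, then clip b."""
    def sample(t: float) -> List[float]:
        return a.sample(t) if t < a.duration else b.sample(t - a.duration)
    return Clip(a.duration + b.duration, sample)

def blend(a: Clip, b: Clip, weight: float) -> Clip:
    """Clip operator: linear blend of two clips of equal duration."""
    def sample(t: float) -> List[float]:
        pa, pb = a.sample(t), b.sample(t)
        return [(1 - weight) * x + weight * y for x, y in zip(pa, pb)]
    return Clip(a.duration, sample)

# A dynamic motion model can stack such operators to produce a new
# motion in real time without authoring a new clip.
walk = Clip(1.0, lambda t: [t, 0.0])
limp = Clip(1.0, lambda t: [t, 1.0])
motion = concat(blend(walk, limp, 0.5), walk)
```

Because operators return ordinary clips, they compose freely, which is what lets a motion model assemble animations on the fly from a fixed clip library.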
The Choreography Editor is responsible for creating and editing simple animation sequences of virtual characters. Simple reactive behaviour is modeled by tasks, which are arranged on a timeline and dynamically control the characters. In addition to the tasks responsible for motion generation, so-called Sync Points are realized to ensure spatio-temporal constraints. A mechanism for conflict resolution between tasks is also implemented.
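One simple way such a task timeline with conflict resolution could work is priority-based arbitration between overlapping tasks. The following sketch is an assumption for illustration; the `Task` fields and the priority scheme are not taken from the actual Choreography Editor.

```python
# Hypothetical sketch: tasks on a timeline, with conflicts between
# overlapping tasks resolved by priority. Names are illustrative.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Task:
    name: str
    start: float
    end: float
    priority: int  # higher priority wins when intervals overlap

def active_task(tasks: List[Task], t: float) -> Optional[Task]:
    """Return the task controlling the character at time t.

    Conflict resolution: among all tasks whose interval covers t,
    the one with the highest priority takes control."""
    candidates = [task for task in tasks if task.start <= t < task.end]
    return max(candidates, key=lambda task: task.priority, default=None)

timeline = [
    Task("walk to stage", 0.0, 5.0, priority=1),
    Task("wave at audience", 2.0, 3.0, priority=2),  # briefly overrides walking
]
```

A Sync Point would then be a shared timeline marker that several tasks reference, so that the spatio-temporal constraint (e.g. two characters meeting at the same spot and time) is kept when tasks are shifted.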
Virtual characters together with an extensive motion library were created. To simplify the creation of the characters' geometry and motions, numerous plugins and software tools were developed.
The results of this project can be used as independent software components in VR/AR systems or game engines. The separation of motion creation from the representation of the character enables the use of these components in different scenarios. The animation system may serve as a starting point for further research and development of autonomous characters.