
dc.contributor.author  Oyekoya O
dc.contributor.author  Stone R
dc.contributor.author  Steptoe W
dc.contributor.author  Alkurdi L
dc.contributor.author  Klare S
dc.contributor.author  Peer A
dc.contributor.author  Weyrich T
dc.contributor.author  Cohen B
dc.contributor.author  Tecchia F
dc.contributor.author  Steed A
dc.contributor.editor  Thalmann NM
dc.description.abstract  In the BEAMING project we have been extending the scope of collaborative mixed reality to include the representation of users in multiple modalities, including augmented reality, situated displays and robots. A single user (a visitor) uses a high-end virtual reality system (the transporter) to be virtually teleported to a real remote location (the destination). The visitor may be tracked in several ways including emotion and motion capture. We reconstruct the destination and the people within it (the locals). In achieving this scenario, BEAMING has integrated many heterogeneous systems. In this paper, we describe the design and key implementation choices in the Beaming Scene Service (BSS), which allows the various processes to coordinate their behaviour. The core of the system is a light-weight shared object repository that allows loose coupling between processes with very different requirements (e.g. embedded control systems through to mobile apps). The system was also extended to support the notion of presence awareness. We demonstrate two complex applications built with the BSS.  en_US
dc.relation  19th ACM Symposium on Virtual Reality Software and Technology (VRST 2013) ; Singapore : 6.10.2013 - 9.10.2013
dc.title  Supporting interoperability and presence awareness in collaborative mixed reality environments  en_US
dc.type  Book chapter  en_US
dc.publication.title  Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology

Files in this item

There are no files associated with this item.
