One can safely assume that smartphones, tablets, and, prospectively, smart glasses will drive the future of spatial computing. But the more mobile the hardware, the less computing power it has. That, in short, is the well-known problem immersive technologies face when dealing with large amounts of data. Outsourcing the rendering process through XR streaming can therefore be a game changer for the XR community. The Remote Rendering SDK, for example, enables entire AR or VR applications to be streamed from powerful local servers or from the cloud. A device-agnostic approach based on deployed client applications also reduces the development effort for XR applications: new applications can be built with this easy-to-integrate tool, free of the limits and restrictions of individual end devices, simply by building a server application. Time-consuming data simplification is no longer necessary. Streaming complete applications also increases data security: with the Remote Rendering solution, the data is merely streamed from a chosen server and never stored on the mobile device.
The goal within the ARtwin project was to develop a remote application rendering framework that enables XR applications to be streamed from a computer with a powerful GPU to performance-limited XR devices, such as the HoloLens 2 or even modern mobile phones.
The result of extensive research and development is an easy-to-use Unity3D plugin based on Unity’s XR Plugin Management system, with preconfigured profiles for the Mixed Reality Toolkit (MRTK) and the AR Foundation framework. There is a fully functional HoloLens 2 client with 6DoF tracking and articulated hand input, as well as stable Android and iOS clients with single-touch input, along with full audio support and image tracking. To increase performance and improve object stabilization, considerable effort went into reprojection based on the depth buffer and the alpha channel.
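To give an intuition for what depth-based reprojection does, the sketch below warps a server-rendered RGBA frame from the pose it was rendered at to the pose the client is about to display, using the per-pixel depth buffer; the alpha channel marks pixels that carry no content and are skipped. This is a deliberately simplified illustration and not the SDK’s actual implementation; the function name and conventions (pinhole intrinsics, camera-to-world poses) are assumptions made for the example.

```python
# Minimal depth-based reprojection sketch (illustrative only, not the SDK's
# actual implementation). Warps a server-rendered RGBA frame from the camera
# pose it was rendered at to the pose the client is displaying at, using the
# per-pixel depth buffer. Alpha = 0 marks pixels without valid content.
import numpy as np

def reproject(rgba, depth, K, T_render, T_display):
    """rgba:    (H, W, 4) color + alpha as rendered on the server
       depth:   (H, W)    per-pixel depth in meters (camera space, > 0)
       K:       (3, 3)    pinhole camera intrinsics
       T_render, T_display: (4, 4) camera-to-world poses"""
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x N

    # Back-project every pixel into the render camera's space using its depth.
    rays = np.linalg.inv(K) @ pix                      # 3 x N
    pts_cam = rays * depth.reshape(1, -1)              # 3 x N

    # Render camera -> world -> display camera.
    pts_h = np.vstack([pts_cam, np.ones((1, pts_cam.shape[1]))])   # 4 x N
    pts_disp = np.linalg.inv(T_display) @ (T_render @ pts_h)       # 4 x N

    # Project into the display camera and forward-warp the colors.
    proj = K @ pts_disp[:3]
    z = proj[2]
    uv = np.round(proj[:2] / z).astype(int)
    colors = rgba.reshape(-1, 4)
    valid = (z > 0) & (uv[0] >= 0) & (uv[0] < W) & (uv[1] >= 0) & (uv[1] < H)
    valid &= colors[:, 3] > 0                          # skip transparent pixels
    out = np.zeros_like(rgba)
    out[uv[1, valid], uv[0, valid]] = colors[valid]
    return out
```

A production reprojection would run on the GPU, resolve occlusions with a depth test, and fill the holes that forward warping leaves; the sketch only shows the geometric core of the technique.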
The Remote Rendering service is a three-component solution:
- The “Rendering Server” component: an XR application runtime that generates a stream of rendered views, which are continuously sent to the client component.
- The “XR Client” component: runs on an XR device (e.g. HoloLens 2), receives the rendered image stream, and simultaneously sends its pose and other input data back to the server.
- The “Signalling Server” component: runs on customer-specific infrastructure and handles establishing the connection between the Rendering Server and XR Client components (a minimal illustrative sketch follows this list).
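The concrete signalling protocol is part of the SDK, but the role of this component can be illustrated with a short sketch. The following Python relay, built on the third-party websockets package, is an assumption made purely for illustration: the “role” and “session” message fields are invented here, not taken from the SDK. It pairs one Rendering Server with one XR Client per session and forwards their connection-setup messages (for example WebRTC-style offers, answers, and ICE candidates) between them.

```python
# Illustrative signalling relay (an assumption, not the ARtwin implementation):
# pairs a Rendering Server and an XR Client per session id and forwards their
# connection-setup messages verbatim between the two peers.
import asyncio
import json
import websockets

sessions = {}  # session id -> {"server": ws, "client": ws}

async def handle(ws):
    # The first message announces who is connecting and for which session,
    # e.g. {"role": "server", "session": "demo"} (message format is assumed).
    hello = json.loads(await ws.recv())
    role, session = hello["role"], hello["session"]
    peers = sessions.setdefault(session, {})
    peers[role] = ws
    try:
        async for message in ws:
            # Forward every subsequent message to the opposite peer, if present.
            other = "client" if role == "server" else "server"
            peer = peers.get(other)
            if peer is not None:
                await peer.send(message)
    finally:
        peers.pop(role, None)

async def main():
    async with websockets.serve(handle, "0.0.0.0", 9000):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```

Once the two peers have exchanged their setup messages through the relay, the rendered image stream and the pose/input data flow directly between the Rendering Server and the XR Client; the Signalling Server is only involved in connection establishment.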
The result is a highly performant solution that empowers all kinds of XR applications and enables expansion into previously uncharted territory.