Looking Glass Bridge SDK
Render holograms to a Looking Glass display using code!
Looking Glass Bridge provides a standard interface for using Looking Glass displays. Starting with Bridge v2.4.10, Bridge also installs an SDK that exposes the APIs needed to implement custom rendering to Looking Glass displays. The Bridge SDK replaces the legacy integration method using Looking Glass Core. It also provides the updates necessary for rendering to 16″ and 32″ spatial displays. Looking Glass Core apps will continue to work with Bridge, but only on displays sold prior to 2024.
For a Looking Glass display to produce the correct output image, a special renderer must combine 3D content with the display's optical calibration parameters. Looking Glass software works with three types of content: real-time 3D scenes, Quilt media, and RGBD media. The optical calibration parameters are unique to each display. The renderer must use the calibration parameters to select the correct color contribution from the 3D content for a given direction at each point on the display panel surface. This is achieved by applying a transformation that converts an internal quilt representation to match the display's lens. We call this the "Looking Glass optical transformation", and it is usually performed by a special GPU post-processing shader at run time. The documentation on quilts provides more detail.
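As a rough illustration of that view-selection step, the sketch below computes which quilt view a given panel subpixel should show from a handful of calibration values. The field names, units, and exact formula are simplified stand-ins (real calibration files carry additional terms, and the production transform runs in a GPU shader), but the structure of the calculation is the same.

```typescript
// Simplified sketch of the per-subpixel view selection at the heart of the
// Looking Glass optical transformation. Field names and the formula are
// illustrative only; real calibrations include additional terms.

interface Calibration {
  pitch: number;     // lens pitch across the panel (illustrative units)
  tilt: number;      // slope of the lenticular lenses relative to the panel columns
  center: number;    // phase offset aligning lens 0 with pixel 0
  subp: number;      // horizontal offset between R/G/B subpixels, as a fraction of a pixel
  numViews: number;  // number of discrete views stored in the quilt
}

// Returns which quilt view (0 .. numViews-1) a given subpixel should display.
// x and y are normalized panel coordinates in [0, 1); channel is 0 (R), 1 (G), or 2 (B).
function viewIndexForSubpixel(x: number, y: number, channel: number, cal: Calibration): number {
  const phase = (x + channel * cal.subp + y * cal.tilt) * cal.pitch - cal.center;
  const wrapped = phase - Math.floor(phase);                       // wrap into [0, 1)
  return Math.floor(wrapped * cal.numViews) % cal.numViews;
}
```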
The Bridge SDK provides APIs that offer the following capabilities:
Accessing the unique device calibration parameters
Rendering 3D scenes using OpenGL, DirectX, and Metal
Playing Quilt and RGBD images and videos using a custom 3D media player
These APIs are available for native applications (using C, C++, and C#) and web applications (using JavaScript or TypeScript).
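Several of these capabilities revolve around quilts. A quilt is a grid of rendered views packed into a single texture, so looking up a view means computing that tile's UV rectangle. The sketch below shows that lookup; the 8×6 layout is only an example, since real quilts declare their own column and row counts (often encoded in the filename or metadata).

```typescript
// Sketch: given a view index, find the UV rectangle of that view inside a quilt texture.
// The layout values are examples only; actual quilts specify their own dimensions.

interface QuiltLayout {
  columns: number; // tiles per row
  rows: number;    // tile rows
}

function tileUV(view: number, layout: QuiltLayout): { u0: number; v0: number; u1: number; v1: number } {
  const col = view % layout.columns;
  const row = Math.floor(view / layout.columns); // view 0 is conventionally the bottom-left tile
  const w = 1 / layout.columns;
  const h = 1 / layout.rows;
  return { u0: col * w, v0: row * h, u1: (col + 1) * w, v1: (row + 1) * h };
}

// Example: the 12th view of an 8x6 quilt.
const rect = tileUV(11, { columns: 8, rows: 6 });
```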
If you already have an application that integrates with Looking Glass displays using Looking Glass Core, you will need to update it to support the new 16″ and 32″ spatial displays. See our migration guide for details. Bridge supports integration with web apps, native apps, and an HTTP API for 3D media playback.
Web applications can query the calibration parameters needed for 3D rendering from Bridge. The web application must use these calibration parameters to implement a post-processing shader as part of its WebGL rendering pipeline. This method of integration is supported on all platforms and with any Looking Glass display, using Firefox or Chrome. The Looking Glass WebXR library uses this method. For more information, read the web integration documentation.
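A minimal sketch of that flow follows, assuming Bridge exposes the calibration over a local HTTP endpoint. The URL, port, and response shape here are placeholders rather than the documented web API; consult the Bridge web integration documentation for the real endpoint and payload.

```typescript
// Hedged sketch: fetch calibration from the local Bridge service before building
// the WebGL post-processing pass. The endpoint below is a placeholder.

async function loadCalibration(): Promise<Record<string, number>> {
  // Hypothetical local endpoint; Bridge's actual web API may differ.
  const response = await fetch("http://localhost:33334/calibration");
  if (!response.ok) {
    throw new Error(`Bridge returned ${response.status}; is Looking Glass Bridge running?`);
  }
  return (await response.json()) as Record<string, number>;
}

// The returned values (pitch, tilt, center, ...) are then passed as uniforms
// to the WebGL post-processing shader that performs the optical transformation.
```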
There are two options for integrating native renderers written in C, C++, or C#. The first option is to query the device calibration parameters and create your own custom 3D rendering pipeline that applies them in a post-processing shader. This method works on any platform with any Looking Glass display; however, any bugs in the implementation are the implementer's responsibility to fix. One of Looking Glass's own integrations uses this method.
The second option is to configure an existing 3D rendering pipeline to apply the required post-processing for a Looking Glass display. This method is supported on Windows (using OpenGL and DX12) and macOS (using OpenGL and Metal). It works by exposing functions that allocate resources in a shared GPU context; Bridge functions must then be called to trigger the post-processing render pass. This ensures the data is managed efficiently on the GPU to maximize rendering performance. An existing Looking Glass integration uses this method, and the native integration documentation provides more detail.
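To make the call ordering concrete, here is a hedged sketch in TypeScript-style pseudocode. Every function name is a hypothetical stand-in (the real Bridge SDK is a native C/C++/C# API with its own names); only the sequence reflects the description above: allocate shared resources once, render views into them each frame, then hand off to Bridge for the post-processing pass.

```typescript
// Hypothetical stand-ins for the native SDK entry points; names are illustrative only.
declare function bridgeCreateSharedContext(): unknown;
declare function bridgeAllocateQuiltTarget(ctx: unknown, columns: number, rows: number): unknown;
declare function renderSceneViews(target: unknown): void;            // your engine's multi-view render pass
declare function bridgePresent(ctx: unknown, target: unknown): void; // triggers Bridge's post-processing pass

function setup(): { ctx: unknown; quilt: unknown } {
  const ctx = bridgeCreateSharedContext();            // once, at startup
  const quilt = bridgeAllocateQuiltTarget(ctx, 8, 6); // quilt render target shared with Bridge
  return { ctx, quilt };
}

function renderFrame(ctx: unknown, quilt: unknown): void {
  renderSceneViews(quilt);   // the application fills each quilt view
  bridgePresent(ctx, quilt); // Bridge applies the lens transform and presents to the display
}
```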
Web and native applications can configure the Bridge media player through a Bridge HTTP REST API to render local images and videos. This method of integration supports Quilt and RGBD images as JPG and PNG files, and Quilt and RGBD videos as MP4 (H.264 and H.265) and WebM (VP8 and VP9) files. It is supported on macOS, Windows, and Linux for every Looking Glass display sold since 2021. One of Looking Glass's own applications uses this method, and the Bridge HTTP API documentation provides more detail.
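The sketch below shows what a call to the media player API might look like from TypeScript. The route, port, and JSON fields are placeholders for illustration only; the real endpoints and parameters are defined in the Bridge HTTP API documentation.

```typescript
// Hedged sketch of driving the Bridge media player over its local HTTP API.
// The route and body fields are hypothetical; check the Bridge REST documentation.

async function playQuiltVideo(path: string): Promise<void> {
  const response = await fetch("http://localhost:33334/show", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      uri: path,     // local file path to a Quilt/RGBD MP4, WebM, JPG, or PNG
      type: "quilt", // or "rgbd" for depth-based media
      rows: 6,       // quilt layout, if not encoded in the filename
      columns: 8,
    }),
  });
  if (!response.ok) {
    throw new Error(`Bridge media player request failed: ${response.status}`);
  }
}
```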