Description
Tracking the pose of an XR device in space is a core feature of any XR stack. Visual-inertial methods have become very popular in recent hardware. Running on devices with one or more cameras and an IMU, they have revolutionized VR and AR by making it possible to do away with external tracking sensors entirely. Unfortunately, the systems used in production by the major hardware and software platforms are closed source. In this talk, we will present the work done on top of Monado, the open-source OpenXR runtime, to build a solid foundation for visual-inertial tracking in XR. The talk will cover the effort to integrate different open-source SLAM/VIO systems from academia, such as Basalt, into Monado, and the devices that can now use them. We will cover the fundamental theoretical and practical problems these kinds of systems face, with a focus on the specific issues the XR domain brings to the table. We will also give a broad overview of the open tools, metrics, and datasets developed alongside this effort, as well as our future plans to keep improving and expanding on it.