FOSS XR Conference 2022

US/Central
Schulze Hall (University of St Thomas Opus Hall)

46 S 11th St, Minneapolis, MN 55403, USA
Description

FOSSXR is the International Conference for Free and Open-Source XR, for everyone interested in VR/AR/MR and Free Software. With a strong focus on dozens of XR software projects and freely available drivers and hardware, the FOSSXR Conference brings the community together and gives a stage to the future of XR!

Chairperson: Frederic Plourde (Collabora Ltd.)
  • Wednesday, 5 October
    • 08:00 09:20
      Coffee: All Morning Drinks and light pastries
    • 08:30 11:50
      Morning session
      • 08:30
        FOSSXR day1 opening session 15m
        Speaker: Frederic Plourde (Collabora Ltd.)
      • 08:45
        How to interact with your open source desktop in the far present. 35m

        The goal of this talk is to summarize the recent year of xrdesktop development and showcase capabilities of the upcoming 0.16 release.
        Part of that will be our standalone VR Wayland compositor wxrd, the virtual 3D UI keyboard with localization support, and our focus on the industry-standard OpenXR API.
        I will show the ability of xrdesktop to display virtual environments and run on embedded devices like the Raspberry Pi 4 or AMD SoCs.
        Since the 0.16 release will also be the first containing the G3k 3D widget toolkit, using Vulkan for rendering, I will present xrdesktop's way forward, outlining our goals to provide a full XR system shell.

        Speaker: Lubosz Sarnecki (Collabora Ltd.)
      • 09:25
        Open-handed: Tools for great hand tracking 40m

        Hand tracking will be as important for XR as the touchscreen is for mobile. It adds to the illusion of presence and provides the primary user input method. Ultraleap is a proud supporter of the FOSS movement, from providing FOSS plugins for Unity and Unreal to open-sourcing the North Star headset. In this session we will cover the latest developments in Ultraleap's hand tracking, including how to use it via OpenXR (a minimal code sketch follows this entry).

        Speakers: Adam Harwood (Ultraleap), Rodolphe Houdas (Ultraleap)
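
        To make "using it via OpenXR" concrete, here is a minimal C++ sketch of polling hand joints through the XR_EXT_hand_tracking extension. It assumes an already-created XrInstance, XrSession, base XrSpace, and frame time, with the extension enabled at instance creation; error handling and teardown are omitted:

          #include <openxr/openxr.h>

          void poll_left_hand(XrInstance instance, XrSession session,
                              XrSpace base_space, XrTime time) {
              // Extension functions must be fetched at runtime.
              PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = nullptr;
              PFN_xrLocateHandJointsEXT  xrLocateHandJointsEXT  = nullptr;
              xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                                    (PFN_xrVoidFunction *)&xrCreateHandTrackerEXT);
              xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                                    (PFN_xrVoidFunction *)&xrLocateHandJointsEXT);

              // One tracker per hand. (A real app would create this once
              // and locate joints every frame.)
              XrHandTrackerCreateInfoEXT create_info{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
              create_info.hand         = XR_HAND_LEFT_EXT;
              create_info.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
              XrHandTrackerEXT tracker = XR_NULL_HANDLE;
              xrCreateHandTrackerEXT(session, &create_info, &tracker);

              // 26 joints per hand in the default joint set.
              XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
              XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
              locations.jointCount     = XR_HAND_JOINT_COUNT_EXT;
              locations.jointLocations = joints;

              XrHandJointsLocateInfoEXT locate_info{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
              locate_info.baseSpace = base_space;
              locate_info.time      = time;
              xrLocateHandJointsEXT(tracker, &locate_info, &locations);

              if (locations.isActive) {
                  // e.g. joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose is the
                  // index fingertip pose in base_space.
              }
          }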
      • 10:05
        Break 30m
      • 10:35
        Using the Libre-SOC for building a mobile VR headset 25m

        I'm a contributor to the Libre-SOC project, which aims to build a chip with many peripherals that is also a VPU and a Vulkan-powered 3D GPU, so it could run Monado. By contrast, mobile HMDs such as the Meta Quest use non-libre SoCs such as the Qualcomm Snapdragon XR2. A VR headset and its controllers also need tracking; in the case of the Valve Index this is implemented using a microcontroller and an iCE40 FPGA. So first I am going to experiment with a libsurvive tracker written using nMigen.

        Speaker: Tobias Platen (Libre-SOC)
      • 11:10
        Starting the Project Northstar Foundation for Open Source XR Hardware 40m

        While Open Source software is generally straightforward to set up without capital, it's much more difficult to do so for hardware. There are a number of limitations that make large-scale open source XR hardware challenging, such as minimum order quantities, NDAs for chip and component definitions, and the upfront cost of manufacturing, testing, and shipping the finished products.

        Led by Noah Zerkin & Bryan Chris Brown, the Project North Star Foundation was started to help address these issues and provide an organization that is able to design, source, manufacture, and distribute open source hardware.

        This talk will take attendees through the journey of how we arrived at the foundation and what our future goals with the project are.

        Speaker: Bryan Brown (Project North Star Foundation)
    • 10:05 10:35
      Demo
      • 10:05
        HT + SLAM demo (Day1 Break1) 30m

        Monado-based demo of a hand-tracking + SLAM setup running on the North Star.

        Speakers: Mateo de Mayo (Collabora), Moses Turner (Collabora)
      • 10:05
        LÖVR Demo (Day1 Break1) 30m

        LÖVR is an open source Lua framework for creating VR games and applications. This will be a demo of a VR time travel debugger and text editor, built with LÖVR.

        Speaker: Bjorn Swenson (Collabora Ltd.)
    • 11:50 13:20
      Lunch break 1h 30m
    • 12:35 13:20
      Demo
      • 12:35
        HT + SLAM demo (Day1 Lunch) 45m

        Monado-based demo of a hand-tracking + SLAM setup running on the North Star.

        Speakers: Mateo de Mayo (Collabora), Moses Turner (Collabora)
      • 12:35
        LÖVR Demo (Day1 Lunch) 45m

        LÖVR is an open source Lua framework for creating VR games and applications. This will be a demo of a VR time travel debugger and text editor, built with LÖVR.

        Speaker: Bjorn Swenson (Collabora Ltd.)
    • 13:20 18:00
      Afternoon session
      • 13:20
        Update on the state of FOSS XR 25m

        This talk will go through the current state of FOSS XR, covering the ecosystem in general and focusing on the Monado project.

        Speaker: Jakob Bornecrantz (Collabora)
      • 13:45
        Frame Timing and Pacing in XR 20m

        This talk will go through the lifetime of an OpenXR application frame, then delve deeper into how an OpenXR runtime interacts with the display system. The talk is meant to provide food for thought when designing app frame timing and pacing for windowing systems. It will also go through the current APIs for displaying content on screen (a sketch of the frame loop in question follows this entry).

        Speaker: Jakob Bornecrantz (Collabora)
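
        As background, here is a minimal C++ sketch of the OpenXR calls that bound a single application frame; xrWaitFrame is the point where the runtime paces the app. It assumes a running XrSession, and rendering and layer submission are omitted:

          #include <openxr/openxr.h>

          void render_one_frame(XrSession session) {
              // xrWaitFrame blocks until the runtime wants the next app
              // frame -- this is where frame pacing happens.
              XrFrameWaitInfo wait_info{XR_TYPE_FRAME_WAIT_INFO};
              XrFrameState frame_state{XR_TYPE_FRAME_STATE};
              xrWaitFrame(session, &wait_info, &frame_state);

              XrFrameBeginInfo begin_info{XR_TYPE_FRAME_BEGIN_INFO};
              xrBeginFrame(session, &begin_info);

              if (frame_state.shouldRender) {
                  // Locate views and render for
                  // frame_state.predictedDisplayTime here.
              }

              // Submitting zero layers here; a real app hands its
              // composition layers to the compositor via end_info.layers.
              XrFrameEndInfo end_info{XR_TYPE_FRAME_END_INFO};
              end_info.displayTime          = frame_state.predictedDisplayTime;
              end_info.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
              end_info.layerCount           = 0;
              end_info.layers               = nullptr;
              xrEndFrame(session, &end_info);
          }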
      • 14:10
        Project North Star: Powered by Community 30m

        Project North Star is an open source AR headset that sits at the center of a community of enthusiasts building and improving on it. This talk interviews a number of members of that community in order to celebrate their work and understand how it affected their lives.

        Speaker: Charlton Rodda (Collabora)
      • 14:45
        Stardust: a better display server for XR 45m

        So far, most people have not been able to use XR for significant periods of time, to replace our phones or computers, or just to allow us to do our work in a more ergonomic and intuitive fashion wherever we are. A massive roadblock is that we have not had a display server that is built around the constraints and freedoms XR brings while providing better interaction with all the software you already run in 2D. 2D display servers are not perfect either; XR offers opportunities to fix some of the issues they contain. This talk will introduce Stardust XR, a display server that is being developed to make interactions with your computer through XR useful, account for more accessibility needs than existing desktops and display servers, and let you customize in a much more intuitive manner. It supports 2D apps through Wayland, and will support better interaction with 3D applications through OpenXR.

        Speaker: Nova King
      • 15:30
        Break 30m
      • 16:00
        Introducing Monado's Optical Hand Tracking 45m

        As AR and VR have matured, optical hand tracking has emerged as the default human-computer interaction method - it's simple, intuitive, and requires no extra hardware. However, it is complicated to implement and has historically required proprietary hardware and software. As part of Monado, the open-source OpenXR runtime, Collabora has developed a fully open-source optical hand tracking pipeline that's suitable for AR/VR interaction. In this talk, Moses Turner will demonstrate use of this pipeline, go over the history of its creation, and explain its relevance to the rest of the XR ecosystem.

        Speaker: Moses Turner (Collabora)
      • 16:50
        ILLIXR: Illinois Extended Reality Testbed 45m

        We will present an XR testbed called ILLIXR (Illinois Extended Reality testbed), an open-source end-to-end XR system. ILLIXR supports XR perception, visual, and audio subsystems, consisting of state-of-the-art sensors and components (e.g., visual inertial odometry, scene reconstruction, asynchronous reprojection, and 3D spatial audio encoding and decoding), all orchestrated through a flexible and efficient runtime system. Enabled by Monado, ILLIXR runs XR applications conforming to the OpenXR interface. It runs on Linux PCs and embedded systems (e.g., NVIDIA Jetson), provides the option to offload some components to the cloud (e.g., AWS), and displays images on multiple commercial headsets. It provides extensive telemetry, enabling detailed power, performance, and quality-of-service measurements and insights on a fully functional XR system.

        ILLIXR has led to a consortium with industrial and academic partners with the goal of democratizing XR research, development, and benchmarking. The consortium aims to establish a reference open-source testbed, a standard benchmarking methodology, and a cross-disciplinary R&D community for XR systems. ILLIXR is already being used in a variety of research projects to enable advances in XR hardware, software, systems, and algorithms, with the goal of improving end-to-end user quality of experience. This research includes designing new hardware accelerators that are codesigned with innovations in software algorithms for XR; codesigning 2.5D and 3D packaging technologies (for sensors, compute, and memory) with architectures and algorithms; techniques for offloading XR components from power constrained wearables to edge and cloud servers; compilation technologies; scheduling and runtime system design; quality metrics; and more.

        Speaker: Sarita Adve (University of Illinois at Urbana-Champaign)
      • 17:40
        LucidGloves: Mom said we have VR Gloves at home 20m

        Is there anything you can do if there's a VR device you've always wanted to try but can't get, since it's enterprise-only? One option is to just build it yourself.

        This talk explores the viability of DIY alternatives for inaccessible VR technologies through the story of the LucidGloves project: open-source VR haptic gloves with finger tracking and force feedback that can be built for just $60 a pair. Once just an idea during quarantine, it is now a reality in the hands of hundreds and viewed by millions online.

        Learn about the benefits and obstacles to designing your own VR peripheral hardware, and the challenges of distributing DIY VR tech without the resources of a hardware company. Could a "DIY VR revolution" contribute to the push for a more open ecosystem for virtual reality?

        Speaker: Lucas De Bonet (LucidVR)
    • 15:30 16:00
      Demo
      • 15:30
        HT + SLAM demo (Day1 Break2) 30m

        Monado-based demo of a hand-tracking + SLAM setup running on the North Star.

        Speakers: Mateo de Mayo (Collabora), Moses Turner (Collabora)
      • 15:30
        LÖVR Demo (Day1 Break2) 30m

        LÖVR is an open source Lua framework for creating VR games and applications. This will be a demo of a VR time travel debugger and text editor, built with LÖVR.

        Speaker: Bjorn Swenson (Collabora Ltd.)
    • 18:00 19:00
      Demo: Optional Demo Session
    • 18:00 20:00
      Happy Hour: Drinks and light appetizers
  • Thursday, 6 October
    • 08:00 09:00
      Coffee: All Morning Drinks and light pastries
    • 09:00 12:15
      Morning session
      • 09:00
        FOSSXR day2 opening session 15m
        Speaker: Frederic Plourde (Collabora Ltd.)
      • 09:20
        Free drivers for Oculus Rift headsets 45m

        Over the past 3 years, I have been working on implementing positional tracking for Oculus Rift headsets. This talk will go through the Rift constellation system, and the differences between the older DK2/CV1 devices and the newer Rift S. It will touch on reverse engineering an unknown headset protocol, and what we understand of talking to Rift devices. Finally, I'll talk about progress on the algorithms for turning the information they provide into useful room-scale tracking, and the limitations of my current approach.

        Speaker: Jan Schmidt (Centricular Ltd)
      • 10:05
        Break 30m
      • 10:35
        Opengloves: getting a hand-le on VR gloves 20m

        Opengloves was previously a Windows-only OpenVR driver for DIY VR gloves. During my GSoC internship working on Monado, I ported the driver to Linux, making it compatible with OpenXR. Combined with hand tracking, the gloves also support force feedback, presenting unique challenges in implementing interfaces to handle new technologies.

        Opengloves only handles controllers, presenting interesting challenges that have perhaps not been seen by vendors who ship HMDs. While those vendors can write runtimes to make their devices compatible with OpenXR, developing an OpenXR runtime just for Opengloves isn't practical for a controllers-only DIY project.

        This talk covers the new ground the project has broken in VR development and the lessons we've learned developing drivers across specifications for VR gloves.

        Speaker: Daniel Willmott (GSoC Intern at Collabora)
      • 11:00
        The state of xrdesktop on SoCs 35m

        xrdesktop aims to provide an XR desktop experience not just on typical "VR ready" PCs but also on smaller systems.
        What does it take to run xrdesktop's wxrd standalone client, or its gnome-shell or kwin integrations, on an NVIDIA Jetson board? On a Raspberry Pi?

        After an overview of how xrdesktop takes windows into XR, this talk will present the solved and still-open challenges of running xrdesktop on SoCs and other small systems, from performance considerations to GPU drivers.

        Speaker: Christoph Haag (Collabora)
      • 11:40
        WebXR, what's new since FOSSXR 2019, metaverse and more 35m

        Following FOSSXR 2019, both XR and the Web have evolved. The latest buzzword, introduced in late 2021, seems like an intangible abstraction: what actually is the metaverse, and why is the Web the perfect place for it to start?

        This talk will clarify what has changed since 2019, both in software and hardware, with a specific focus on FLOSS efforts from all teams.

        The goal is to highlight gaps in the ecosystem for anybody who wants to both visit and build the metaverse with FLOSS components. It is possible, and it can be done while keeping freedom in mind.

        Practically speaking, we will touch on the WebXR specifications, the current browsers per device, and the networking stack that allows independent XR experiences to interconnect, letting participants build independently while still letting users bring their own avatars and more.

        Speaker: Fabien Benetou (European Parliament Innovation lab WebXR consultant)
    • 10:05 10:35
      Demo
      • 10:05
        HT + SLAM demo (Day2 Break1) 30m

        Monado-based demo of a hand-tracking + SLAM setup running on the North Star.

        Speakers: Mateo de Mayo (Collabora), Moses Turner (Collabora)
      • 10:05
        LÖVR Demo (Day2 Break1) 30m

        LÖVR is an open source Lua framework for creating VR games and applications. This will be a demo of a VR time travel debugger and text editor, built with LÖVR.

        Speaker: Bjorn Swenson (Collabora Ltd.)
    • 12:15 13:45
      Lunch break 1h 30m
    • 12:15 13:00
      Panel Discussion: Panel
      • 12:15
        GDPXR: metaverse and personal data, or why FLOSS is key to respecting user privacy and current or upcoming privacy laws 45m

        Presentation context: in Europe, the GDPR has been in place since 2016, with enforcement since 2018 (a two-year grace period).
        In 2020, the CJEU issued a ruling that scrapped the "safe harbor" treaty, challenging the use of US tech services in the countries of the union.
        In this context, and with other privacy-protecting laws likely coming in the US and elsewhere, the development of VR technologies needs to take data sharing very seriously.
        The use of FLOSS tools is particularly well adapted to this situation, with transparency, auditability, and interoperability at its core.

        Discussion: how do we ensure that data privacy is safeguarded through interoperable platforms and tools? What do we do with "safe harbor" down? How do we build "the metaverse" while respecting user privacy?

        Speaker: Olivier Meunier (ozmovr.eu)
    • 13:45 18:30
      Afternoon session
      • 13:45
        StereoKit, an Open Source Mixed Reality Engine 20m

        StereoKit is a lightweight, code-first, open-source library for building Mixed Reality experiences using C#/C++ and OpenXR! It provides easy APIs to address some of the most common and challenging aspects of development, including UI, inputs, interactions, physics, shaders/material systems, working with text, asset loading, and much more. StereoKit is designed to work on Windows/Linux/Android, so virtually any device that works with an OpenXR runtime! This talk contains an overview of StereoKit and how to get started writing productive XR apps (a minimal getting-started sketch follows this entry)!

        Speaker: Nick Klingensmith (Microsoft)
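
        As a taste of how little code a StereoKit app needs, here is a minimal C++ "hello cube" sketch. The names follow the C/C++ API in stereokit.h; treat this as an approximation and verify against your StereoKit version:

          #include <stereokit.h>
          using namespace sk;

          mesh_t     cube;
          material_t mat;

          void step() {
              // Draw the cube half a meter in front of the world origin.
              render_add_mesh(cube, mat, matrix_trs(vec3{0, 0, -0.5f}));
          }

          int main() {
              sk_settings_t settings = {};
              settings.app_name = "hello cube";
              if (!sk_init(settings)) return 1;

              cube = mesh_gen_cube(vec3{0.1f, 0.1f, 0.1f});
              mat  = material_find("default/material");  // built-in default

              // StereoKit owns the OpenXR frame loop; step() runs once
              // per frame until the app exits.
              sk_run(step);
              return 0;
          }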
      • 14:10
        spatialfree, setting spatial interactions free 20m

        How to open-source XR interactions, despite the prevailing pedantic paper paradigm, for the inclusion of more human potential and expression in the world. This will be demonstrated by computing towards an interaction design concept: proving things out with turing-tested, first-principles engineering, employing the abstract mechanics of vector and quaternion math, and exporting the worthwhile results as a spatial pattern by passing them through a pedantry filter and a custom syntax highlighter/inspector. With this technique, we have thus far unlocked 12+ DoFs (principal proprioceptive remappings) at dofdev, repeatedly realizing intuitive interactions for seemingly impractical tasks, such as freely moving, rotating, and scaling an oriel at a distance. By distilling them down into accessible and distributable spatial patterns, they are made available to the open-source community to use, share, and improve upon, regardless of coding language or engine of choice.

        Speaker: Ethan Merchant (dofdev)
      • 14:35
        OpenXR on Android - Source Included 20m

        This year the Khronos OpenXR Working Group has finalized several important parts of the OpenXR ecosystem for Android-based devices, both all-in-one headsets and runtimes that work on already-released phones. There are several parts that make this work, and they are all open source and maintained in a community fashion. In this talk, I will present the basic parts of the OpenXR on Android ecosystem, as well as some of the considerations that went into their design. Finally we'll discuss how you can get started working with it and the relevant open source projects, whether you are interested in the application side, the runtime side, or the parts in between making it possible.

        Speaker: Ryan Pavlik (Collabora)
      • 15:00
        Intuitive Shapes 20m

        Current XR interactions are usually only accessible in specific positions, making them difficult to use while relaxed and often inaccessible to people with disabilities. This is the case despite 3D tracking technology allowing the creation of new, comfortable ways to interface with a system. In this talk, I will explain how we can use tracked body parts to create accessible user interfaces using components or "shapes" we have identified and made freely available on our website. The purpose of this is to save designers time when approaching some basics of 3D interaction and, furthermore, to help facilitate user interfaces that can be fitted to whatever tracked body parts are available, allowing them to be used in any position and with many physical disabilities.

        Speaker: Niko Ryan (dofdev)
      • 15:20
        Break 30m
      • 15:50
        Visual-inertial tracking for Monado 45m

        Tracking the pose of an XR device in space is a core feature of any XR stack. Visual-inertial methods have become very popular in the hardware of recent years. These methods are applied on devices with one or more cameras and an IMU; they have revolutionized VR and AR by making it possible to completely abandon external sensors for tracking. Unfortunately, the systems used in production for the main hardware and software platforms are closed source. In this talk, we will present the work made on top of Monado, the open-source OpenXR runtime, to build a solid foundation for visual-inertial tracking for XR. The talk will cover the integration effort of different open-source SLAM/VIO systems from academia, like Basalt, into Monado and the devices that can now use it. We will cover the fundamental theoretical and practical problems these kinds of systems face, with a focus on the specific issues the XR domain brings to the table. A broad overview of the open tools, metrics, and datasets developed alongside this effort will be presented, as well as our future plans to keep improving and expanding on this.

        Speaker: Mateo de Mayo (Collabora)
      • 16:40
        Godot Engine 4: a completely free XR creation platform 40m

        Godot Engine is one of the fastest growing fully open source (MIT License) game engines. With Godot 4.0 now in beta, a Vulkan-native OpenXR capable ecosystem is finally here, bringing XR content creation tools rivaling the industry incumbents Unreal and Unity.

        "Godot provides a huge set of common tools, so you can just focus on making your game without reinventing the wheel. Godot is completely free and open-source under the very permissive MIT license. No strings attached, no royalties, nothing. Your game is yours, down to the last line of engine code." —godotengine.org

        In this talk, I will cover what Godot Engine is, how the open Khronos glTF 2.0 standard can be used to import content, characters and animations from Blender, Mixamo and other industry standard tooling, how Godot's GDScript language offers seamless scripting, and demo how Godot can be used to create immersive XR content.

        I will also discuss the glTF ecosystem including XR relevant extensions from Khronos, Open Metaverse Interoperability Group, and avatar extensions from the VRM Consortium.

        Finally, I'll explain my mission as part of the V-Sekai community of building a federated and fully F/OSS platform for social XR on top of the Godot ecosystem, as well as the long-term vision for Godot in the open source community, following Blender's lead in establishing a foothold in industry to become a leading F/OSS technology choice.

        Speaker: Patrick Horn (Godot)
      • 17:20
        Immersive Visualization with the ParaView Open Source Visualization Tool 40m

        Scientific visualization has long benefited from the use of immersive technologies (XR), going back to when XR hardware was expensive and usually found only in research labs. The eruption of lower-cost consumer XR technology benefits not only the gaming community but also increases the potential benefits to science and other "serious" pursuits.

        ParaView, a popular open-source scientific visualization tool, has two "plugins" that have been developed to allow researchers to immersively interact with 3D data using either a consumer headset (HMD) or a CAVE-style VR system.

        The headset plugin now includes both an OpenXR and an OpenVR interface, allowing any modern PC-connected headset to instantly shift from a desktop to an immersive viewpoint. The CAVE plugin likewise generates outputs suitable for walk-in VR experiences, and interfaces with both high-end and consumer VR position tracking systems. Together, these plugins enable ParaView to be quickly interfaced to almost any XR system, large or small.

        Under the hood of ParaView is VTK (the Visualization Toolkit), which, also being open source, can be directly interfaced to both OpenXR and OpenVR using only a handful of Python (or C++) programming statements to again shift from desktop to XR (a sketch of this pattern follows this entry).

        The U.S. National Institute of Standards and Technology (NIST) has been collaborating with Kitware Inc. (maintainers of ParaView and VTK) to further expand the immersive functionality of these tools, including the work to interface with the OpenXR Khronos standard. The NIST High Performance Computing and Visualization Group routinely uses both consumer headset as well as CAVE-style VR facilities to provide researchers the ability to naturally interact with their simulation data through immersive technologies.

        The NIST team will present the current state of the ParaView and VTK interfaces to immersive technologies, and discuss our future plans as we continue our collaboration with Kitware Inc. We also plan to bring both HMD and CAVE-style (one-screen) displays to provide live demonstrations of our ongoing efforts.

        Speaker: William Sherman (NIST)
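
        To illustrate the "handful of programming statements" claim, here is a minimal C++ sketch that moves a standard VTK pipeline into a headset by swapping in the OpenXR renderer, window, and interactor classes. It assumes a VTK build with the RenderingOpenXR module enabled; the class names follow the VTK 9.2-era API and are worth verifying against your version:

          #include <vtkActor.h>
          #include <vtkConeSource.h>
          #include <vtkNew.h>
          #include <vtkOpenXRRenderWindow.h>
          #include <vtkOpenXRRenderWindowInteractor.h>
          #include <vtkOpenXRRenderer.h>
          #include <vtkPolyDataMapper.h>

          int main() {
              // Ordinary VTK pipeline: source -> mapper -> actor.
              vtkNew<vtkConeSource> cone;
              vtkNew<vtkPolyDataMapper> mapper;
              mapper->SetInputConnection(cone->GetOutputPort());
              vtkNew<vtkActor> actor;
              actor->SetMapper(mapper);

              // The OpenXR renderer/window/interactor are the only
              // XR-specific lines; everything above is unchanged desktop VTK.
              vtkNew<vtkOpenXRRenderer> renderer;
              renderer->AddActor(actor);
              vtkNew<vtkOpenXRRenderWindow> window;
              window->AddRenderer(renderer);
              vtkNew<vtkOpenXRRenderWindowInteractor> interactor;
              interactor->SetRenderWindow(window);

              window->Render();
              interactor->Start();  // blocks; renders into the headset
              return 0;
          }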
      • 18:00
        FOSSXR closing session 20m
        Speaker: Frederic Plourde (Collabora Ltd.)
    • 15:20 15:50
      Demo
      • 15:20
        HT + SLAM demo (Day2 Break2) 30m

        Monado-based demo of a hand-tracking + SLAM setup running on the North Star.

        Speakers: Mateo de Mayo (Collabora), Moses Turner (Collabora)
      • 15:20
        LÖVR Demo (Day2 Break2) 30m

        LÖVR is an open source Lua framework for creating VR games and applications. This will be a demo of a VR time travel debugger and text editor, built with LÖVR.

        Speaker: Bjorn Swenson (Collabora Ltd.)