Collabora Has Been Working On A Fully-Open HTC Vive Driver

Written by Michael Larabel in Valve on 12 October 2016 at 12:58 PM EDT. 6 Comments
While we found out today that Valve is finally expected to show a VR Linux demo, likely using the HTC Vive, it turns out Collabora has been working on their own Linux VR effort with a focus on a fully open-source driver for the HTC Vive for the OSVR platform.

Collabora is hosting their open-source project as OSVR-Vive-Libre, and it was described in detail in an email sent to Phoronix this morning by Collabora's Lubosz Sarnecki.

Below is Lubosz's explanation of their efforts to date:
Our initial code for accessing the device and tracking rotation via the IMU (Inertial Measurement Unit) was forked from the experimental htc-vive branch of the OpenHMD project and ported to modern C++. OSVR-Vive-Libre uses features from C++14.

OpenHMD is limited to internal (IMU) tracking, but our goal is of course to use the external tracking from Lighthouse and to read the controller input as well.

Furthermore, we used the "Lighthouse Redox" documentation, which provides an incomplete, reverse-engineered description of the device's USB reports.

Philipp Zabel's work was of great interest as well, since he was able to extract the internal JSON configuration of the headset and also described the information encoded in the light sensor samples. The headset configuration file contains data calibrated in the factory as well as the sensor positions.
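For context, that extracted configuration stores, among other calibration data, a 3-D point and a normal per photodiode. A hypothetical fragment illustrating the general shape (the field names and values here are illustrative, not a verbatim dump from a headset):

```json
{
  "lighthouse_config": {
    "modelPoints":  [[ 0.085, 0.017, 0.057], [-0.085, 0.017, 0.057]],
    "modelNormals": [[ 0.553, 0.000, 0.833], [-0.553, 0.000, 0.833]]
  }
}
```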

We made a tool called "vivectl" that can dump IMU and light signals for analysis, so the data can be studied even by someone who does not have the hardware.

[Figure: An example plot of the sensor positions and normals]

I analysed the signal with Python and Matplotlib, while Pekka Paalanen (who is frequently mentioned in Wayland-related Phoronix articles) did his analysis with Octave. Pekka implemented a usable classification of the signal and was able to calculate the station view. This is an image of what a base station sees; we can think of it like a camera, even though it is actually an emitter.

On the plot you can see the horizontal and vertical opening angles at the moments the sweeps hit the light sensors.

Most recently I was able to port this classification method from Octave to our C++ driver. We will now be able to use OpenCV's Perspective-n-Point algorithm to retrieve the world position of the headset.

The next goal is to define an EKF (Extended Kalman Filter) that fuses all of the sensor data to calculate the headset position and rotation in real time with low latency.

Emmanuel Gil Peyrot is working on the USB side of the driver. He most recently implemented a libusb backend for better performance and stability; previously we used hidapi, like OpenHMD.

Expect to hear more about the OSVR-Vive-Libre project in the near future; it looks quite exciting for open-source fans wanting to experiment with virtual reality.