PlayStation VR HMD Working On Linux With SteamVR/Dota 2 Thanks To OpenHMD


  • #11
    Originally posted by haagch View Post
    OpenHMD does actually compile for Android. I don't really know of apps that use it... Allegedly there's a VLC build somewhere...

    I actually got a cheap "cardboard" headset for free at the hackathon last weekend, so I'm going to experiment a bit with it. I already discovered that of the open source browsers, only Chromium supports WebVR on Cardboard - and it requires https://play.google.com/store/apps/d...ogle.vr.vrcore. This seems to be a proprietary app. Which means not only is there no WebVR in VR on Linux, you can't even get WebVR on Android on an open source VR stack (except for some WebVR libraries like A-Frame that come with a "polyfill" fallback that has terrible performance on my phone).
    Maybe I'll try updating https://github.com/domination/gvr-services-emulator and adding an OpenHMD backend, but it's not going to be easy because I have not found a full API spec/doc for com.google.vr.vrcore, not even an in-depth description of what it does. It provides sensor data from Android to VR applications of course, but maybe it even does something with rendering?

    If you mean connecting a cardboard headset to a VR application running on the PC - I don't know of an open source solution for that, only proprietary Windows-only ones like TrinusVR.
    The problem is, VR standards are so new that I frankly understand very little about them...

    I know what DX, DX12, OpenGL and Vulkan are. But there are tens of VR standards and somehow they interoperate. Not only that, there are tens of different pieces of equipment for them: Oculus, Cardboard, etc. And to make things worse, some emulate others, etc.

    My wish is simple. I know there are VR-enabled games. Supposedly our fav psychopaths from Croteam made a few games that even run VR on Linux. I would like to see a demo of The Talos Principle in VR.



    DOTA2 supposedly supports VR and it's free. I would love to watch a live match of pro teams in VR, something I can already do in the game engine.

    Would be nice if someone could write a basic explanation of all those confusing standards anyway.



    • #12
      My message got unapproved and was gone.



      • #13
        Devs always overcomplicate :P

        Long story short, I want to use Cardboard on Linux instead of a $600 toy - to play DOTA2 with a face toaster made from my mobile and cardboard. That way I can make an educated decision on whether I want to go all in. Would be awesome to see DOTA2 pro matches in virtual reality. Or shoot some teammates in Team Fortress, or play one of Croteam's many games.

        Also would be nice to get some basic clarification somewhere on what all the standards mean.

        Ask any Linux geek about the difference between Vulkan, DX 12, OpenGL and Metal and they will respond while asleep. Ask about the difference between Oculus Rift, OSVR and OpenHMD and boom... Headshot!
        Last edited by dimko; 22 May 2018, 06:58 PM.



        • #14
          If you want to use VR on your mobile, first off, figure out how to get a low-latency video signal from your PC to your mobile. That's half the battle right there. Secondly, get the gyro in the phone to report X/Y/Z/yaw/pitch/roll coordinates to the PC as integer or long values. These just get mapped to the APIs for other headsets or directly mapped to the game engine as input controls. There is a visual warp effect applied to the scene to account for the curvature of the lenses in the headset. You also need to split the program's renderer into left/right views. That's about all there is to basic HMD VR. If you want motion controls you need more stuff. But if you just want flight or racing games, that's pretty much all you need.
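          A minimal sketch of the PC side of that idea (not from this thread): assume a hypothetical phone app streams its orientation as three little-endian floats (yaw/pitch/roll, in degrees) over UDP; the port and packet format are made up for illustration.

          ```cpp
          // Minimal PC-side listener for a hypothetical phone app that streams its
          // orientation as three little-endian floats: yaw, pitch, roll (degrees).
          // Build on Linux: g++ -o posereader posereader.cpp
          #include <arpa/inet.h>
          #include <netinet/in.h>
          #include <sys/socket.h>
          #include <unistd.h>
          #include <cstdio>

          int main() {
              int sock = socket(AF_INET, SOCK_DGRAM, 0);
              sockaddr_in addr{};
              addr.sin_family = AF_INET;
              addr.sin_addr.s_addr = htonl(INADDR_ANY);
              addr.sin_port = htons(4242);          // arbitrary port, must match the phone app
              if (bind(sock, (sockaddr*)&addr, sizeof addr) < 0) { perror("bind"); return 1; }

              float pose[3];                        // yaw, pitch, roll
              for (;;) {
                  ssize_t n = recv(sock, pose, sizeof pose, 0);
                  if (n != (ssize_t)sizeof pose) continue;  // ignore malformed packets
                  // From here you would map the angles to whatever the game or VR API
                  // expects, e.g. feed them to a SteamVR driver plugin as HMD rotation.
                  std::printf("yaw=%.1f pitch=%.1f roll=%.1f\n", pose[0], pose[1], pose[2]);
              }
              close(sock);  // unreachable in this sketch
          }
          ```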
          Last edited by DMJC; 22 May 2018, 11:43 PM.



          • #15
            Originally posted by DMJC View Post
            If you want to use VR on your mobile, first off, figure out how to get a low-latency video signal from your PC to your mobile. That's half the battle right there. Secondly, get the gyro in the phone to report X/Y/Z/yaw/pitch/roll coordinates to the PC as integer or long values. These just get mapped to the APIs for other headsets or directly mapped to the game engine as input controls. There is a visual warp effect applied to the scene to account for the curvature of the lenses in the headset. You also need to split the program's renderer into left/right views. That's about all there is to basic HMD VR. If you want motion controls you need more stuff. But if you just want flight or racing games, that's pretty much all you need.
            <rant>
            I am confused, buddy, are you one of the developers? If yes - then what you say makes no sense. I am not a developer (ok, I am learning and making my first baby steps), but if I understand correctly, VR programs require a certain API, and the API has to be supported by hardware and software, and all 3 of those (hardware + software + API) are not controlled by the user, merely chosen from, and even then probably dictated by the software.

            Simply running DOTA2 in VR to see a pro match on the game engine does not require a low-latency approach. I am fine with 0.2 s latency with my face toaster connected to my PC over USB 3. (OMG LG G3, I love you; if you continue being so awesome, LG has my heart when it comes to Android phones. Best investment I have ever made.) There would be other games I'd love to see in VR where you don't care about latency. Any 'table top' games, like Civilization or something. (Too bad they are not in VR.)

            </rant>



            • #16
              Originally posted by haagch View Post
              Well it's proprietary hardware. The reverse engineering effort is ongoing.

              FYI X-Plane does not support VR on Linux for no particular reason.
              It's funny, because according to the latest Phoronix benchmark it runs crappy on Windows.
              https://www.phoronix.com/scan.php?pa...gpufresh&num=5
              X-Plane should support SteamVR on Linux once they've moved to Vulkan.



              • #17
                “I am confused, buddy, are you one of the developers? If yes - then what you say makes no sense. I am not a developer”

                Dude - what are you talking about? $50 headset? Cardboard? Just an API? Latency not important?

                I can guarantee you that VR is more difficult than that. If you are going to get a game system that doesn't make you get a headache and vomit, you are into serious engineering complexity.

                What’s next? “It’s just a self-driving electric car for $2000”?



                • #18
                  Originally posted by mike44 View Post
                  X-Plane should support SteamVR on Linux once they've moved to Vulkan.
                  Yes, but there really isn't anything standing in their way of supporting it with their OpenGL engine. I would bet their code would already run as-is on Linux with maybe really minor changes.



                  • #19
                    Originally posted by dimko View Post
                    Simply running DOTA2 in VR to see a pro match on the game engine does not require a low-latency approach.
                    Low latency is required in VR headsets because the VR screen must "move" AS SOON AS you move your head, or you will feel very uncomfortable and potentially get nauseous. This is what OneTimeShot also mentions in his post.

                    Latency has to be very low for this reason (commonly cited motion-to-photon targets are around 20 ms or less). This is one of the things that makes VR hard: moving the VR "screen" as soon as you move your head.

                    The game is not relevant; it is just a matter of moving the image on the VR screen as soon as you move your head. Even looking at a static image will cause issues if there isn't low latency.

                    With a smartphone it would run like crap, because smartphone-PC communication adds a lot of latency.

                    Cardboard runs inside the phone, so there you don't have much latency. If DOTA2 makes a VR app for Android (one that works without a PC), you would be able to use that on your phone. The issue is smartphone-PC communication; if everything is done by the smartphone with no PC, there aren't many latency issues.

                    I don't think a smartphone can actually run a DOTA2 VR application at the same quality as the PC, of course. They would have to lower graphics quality a lot to run on a phone.
                    Last edited by starshipeleven; 23 May 2018, 07:58 AM.



                    • #20
                      Originally posted by dimko View Post
                      Also would be nice to get some basic clarification somewhere on what all the standards mean.
                      Ask any Linux geek about the difference between Vulkan, DX 12, OpenGL and Metal and they will respond while asleep. Ask about the difference between Oculus Rift, OSVR and OpenHMD and boom... Headshot!
                      The problem is that there is no standard for VR-related hardware yet (until OpenXR later this year). If you plug in a mouse and a keyboard, your operating system automatically connects them to the standard way of handling keyboards and mice, and they will instantly work with all applications that support the standard - which is all of them on X11/X.org, because that's where the relevant APIs are.

                      A VR headset is at its core an input device and an output device.

                      The output is obviously the display.
                      Most have a standard display connector via HDMI or DisplayPort and present as a single display with a resolution of 2160x1200 or so.
                      Unfortunately just extending the desktop to that display and starting a game will not look good, because the display is divided between the eyes (some actually use two separate displays, but have a display controller that handles them as one display), so the left eye sees the left part of the image and the right eye sees the right part of the image. So an application that wants to render a VR view needs to split and double the image, and to create a 3D effect it needs to render the scene for each eye from a slightly different viewpoint (see the sketch below). For this the application already needs some knowledge about the hardware. I believe with the HTC Vive the dimensions for the right and left eye are not actually symmetrical. The application also needs to know the distance between the centers of the lenses and other values like the distance from screen to lenses etc. to create a proper 3D effect.
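                      As a rough illustration of that per-eye split (a sketch, not from the post; the IPD constant and the renderScene() stub are placeholders, and head rotation is ignored for brevity):

                      ```cpp
                      #include <cstdio>

                      struct Vec3 { float x, y, z; };

                      // Stand-in for a real renderer: in a real application this would set the
                      // viewport to one half of the HMD display and draw the scene from cameraPos.
                      static void renderScene(Vec3 cameraPos, int viewportX, int viewportWidth) {
                          std::printf("render from (%.3f, %.3f, %.3f) into x=%d w=%d\n",
                                      cameraPos.x, cameraPos.y, cameraPos.z, viewportX, viewportWidth);
                      }

                      // Render the same scene twice, with the camera shifted half the IPD to each
                      // side along world X (ignoring head rotation): the "split and double" step.
                      static void renderStereoFrame(Vec3 headPos, int windowWidth) {
                          const float ipd = 0.063f;  // ~63 mm, a typical adult IPD (placeholder)
                          Vec3 leftEye  = { headPos.x - ipd / 2, headPos.y, headPos.z };
                          Vec3 rightEye = { headPos.x + ipd / 2, headPos.y, headPos.z };
                          renderScene(leftEye,  0,               windowWidth / 2); // left half
                          renderScene(rightEye, windowWidth / 2, windowWidth / 2); // right half
                      }

                      int main() { renderStereoFrame({0, 1.7f, 0}, 2160); }
                      ```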
                      Lenses - they are needed because you can't focus your eyes on a display a few centimeters in front of your face. The lenses make it so your eyes can "focus" at an infinite distance and relax. (Current hardware is limited to just this one focal level, so everything will always look in focus, which sometimes looks a bit unnatural. Future hardware will include more sophisticated displays where you can focus at different distances.) Unfortunately most practical lens shapes produce a "barrel distortion" effect. An application that renders a VR view needs to know the characteristics of this distortion, so it can render its image "reverse distorted", which makes it look undistorted when viewed through a lens with that distortion.
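                      To get a feel for what "reverse distorting" means, here is a sketch using a typical polynomial radial distortion model; the coefficients are made up, real ones come from the vendor or from measuring the lens:

                      ```cpp
                      #include <cstdio>

                      // Typical polynomial radial model: a point at radius r from the lens center
                      // is displaced to r * (1 + k1*r^2 + k2*r^4). To pre-warp ("reverse distort"),
                      // the renderer samples its image at the displaced position, so the lens's
                      // own barrel distortion cancels it out.
                      struct Distortion { float k1, k2; };

                      static void predistort(const Distortion& d, float u, float v,
                                             float* outU, float* outV) {
                          float r2 = u * u + v * v;            // squared radius from lens center
                          float scale = 1.0f + d.k1 * r2 + d.k2 * r2 * r2;
                          *outU = u * scale;
                          *outV = v * scale;
                      }

                      int main() {
                          Distortion lens = {0.22f, 0.24f};    // made-up coefficients
                          float u, v;
                          predistort(lens, 0.5f, 0.5f, &u, &v);
                          std::printf("(0.5, 0.5) -> (%.3f, %.3f)\n", u, v);
                      }
                      ```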

                      The input is first the HMD itself.
                      Having a static image fill your FOV is quite disorienting and not very comfortable, especially for games. The first simple improvement is an IMU - a hardware sensor that knows where "down" is (thanks to gravity) and can detect rotations. A game that gets input from an IMU can detect when a user looks e.g. "up" and rotate the view accordingly, which already creates some of the illusion of looking around inside a game world. IMUs can also detect acceleration, i.e. when you move "forward", "up", etc., but that is way too unreliable to use as the sole source of positional tracking.
                      So there are a bunch of different positional tracking systems that detect where exactly in space the HMD is. Oculus uses a standard high-performance webcam with an infrared sensor, places IR LEDs on the HMD, and uses computer vision algorithms to calculate the position in space (relative to the camera). The HTC Vive has base stations that flood the room with laser sweeps, places IR sensors on the HMD, and calculates the position in space based on the tiny time differences between when the sweeps hit each of the sensors. The Windows Mixed Reality HMDs have 2 webcams on the HMD and use computer vision to do "inside-out tracking" that is purely camera based. These positional tracking systems are usually not quite as fast as necessary, so they use complex filtering (like Kalman filters) to fuse the data from the IMUs with the data from the positional tracking systems and get an accurate and mostly jitter-free pose (position + rotation) of the HMD in space (a toy version of such fusion is sketched below).
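                      A heavily simplified sketch of that fusion idea (a one-axis complementary filter rather than a real Kalman filter; all values are made up):

                      ```cpp
                      #include <cstdio>

                      // Toy 1-axis complementary filter: the gyro integrates quickly but drifts,
                      // the positional tracker is slower/noisier but drift-free. Blending them
                      // gives an estimate that is both responsive and stable. Real VR stacks
                      // fuse full 6-DoF poses, often with Kalman-style filters.
                      struct Fused {
                          float angle = 0.0f;   // current fused estimate (radians)
                          float blend = 0.98f;  // trust in the gyro per update (made-up gain)

                          void update(float gyroRate, float trackerAngle, float dt) {
                              float integrated = angle + gyroRate * dt;  // fast, drifting path
                              angle = blend * integrated + (1.0f - blend) * trackerAngle;
                          }
                      };

                      int main() {
                          Fused f;
                          // Simulated: head turning at 1 rad/s, tracker lagging slightly behind.
                          for (int i = 0; i < 5; i++) {
                              f.update(/*gyroRate=*/1.0f, /*trackerAngle=*/0.2f * i, /*dt=*/0.011f);
                              std::printf("step %d: %.4f rad\n", i, f.angle);
                          }
                      }
                      ```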
                      Many HMDs come with controllers. They are usually tracked with similar techniques. Additionally they have buttons, triggers, touchpads, etc. - but usually those buttons are not hooked up to any standard API. A typical game is not even aware that the Vive controller is an input device, and will certainly not receive any input from it.

                      All of the above - knowledge about HMD specifics like lens separation and distortion, the algorithms to calculate the pose of the HMD and controllers, receiving button input from the controllers - could be done in a game that wants to support VR.
                      But most of the positional tracking algorithms, for example, are proprietary. (That's why OpenHMD is reverse engineering the hardware and developing an open source framework.)
                      Also it is way too complicated to keep track of all the HMDs out there and support them all. For example, if you want to know the lens parameters of the HTC Vive, you can actually request them over USB from the Vive headset and get some JSON back, which you then have to parse before applying your own distortion correction algorithm.

                      That is why VR SDKs are a thing. The basic functionality of a VR SDK is usually (see the interface sketch after this list):
                      * Receive poses (Quaternion + 3D Vector, or transformation matrix) for each tracked device (HMD, controllers)
                      * Receive button/touchpad/trigger input from the controllers
                      * Receive some basic values of the HMD like resolution, user IPD (distance between eyes, for the 3D effect)
                      * For each rendered frame, submit an undistorted texture/image for each eye; the VR SDK will handle all the advanced rendering magic like "reverse distorting" the image appropriately for the lens and interpolating between frames when the application can't keep up
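                      As a sketch only, the surface of such an SDK might look something like this hypothetical C++ interface; none of these names come from a real SDK:

                      ```cpp
                      #include <cstdint>

                      // Hypothetical interface illustrating the four bullets above; no real SDK
                      // uses these exact names or types.
                      struct Pose { float orientation[4]; float position[3]; };  // quaternion + vector

                      struct HmdInfo {
                          uint32_t renderWidthPerEye, renderHeightPerEye;
                          float    ipdMeters;  // distance between eyes, for the 3D effect
                      };

                      class VrSdk {
                      public:
                          virtual ~VrSdk() = default;

                          // 1. Poses for each tracked device (0 = HMD, 1..n = controllers).
                          virtual Pose getPose(int deviceIndex) = 0;

                          // 2. Button/trigger/touchpad input from a controller.
                          virtual bool isButtonPressed(int deviceIndex, int buttonId) = 0;

                          // 3. Basic values of the HMD.
                          virtual HmdInfo getHmdInfo() = 0;

                          // 4. Submit an undistorted texture per eye; the runtime applies lens
                          //    correction and (if needed) frame interpolation.
                          virtual void submitFrame(uint64_t leftEyeTexture, uint64_t rightEyeTexture) = 0;
                      };
                      ```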

                      Of course everyone does their own SDK/runtime with their own API. So if you want to develop a VR application you have to choose.
                      If you use the Oculus SDK, you link to libOVR.dll and use *their* functions to do all the stuff a VR SDK does. Of course then your application will only work on Windows, because the Oculus SDK is Windows-only. Also the Oculus SDK will only give you compatibility with the Oculus Rift HMD, because that's all the Oculus SDK wants to know about. (There's a hack called Revive that translates the Oculus API to SteamVR and makes it possible to run games built for the Oculus SDK with SteamVR instead.)
                      If you use OpenVR/SteamVR, you link to libopenvr_api.so, use *their* functions, etc. (see the sketch below). SteamVR is still a proprietary runtime, but at least they have published an API and documentation on how to create plugins that implement support for an HMD in SteamVR. SteamVR-OpenHMD is such a plugin, only instead of adding support for one HMD it dynamically feeds values from whatever OpenHMD-supported HMD is currently plugged in to SteamVR. As a more ambitious project you can also take OpenVR's header file and start implementing all their proprietary functionality yourself, e.g. by making use of OpenHMD. I actually started this but it doesn't do much yet: https://github.com/ChristophHaag/openvr_api-libre/
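                      To make that concrete, a stripped-down OpenVR frame loop looks roughly like this (a sketch against the public openvr.h; error handling, window setup and the actual GL rendering are omitted):

                      ```cpp
                      // Stripped-down OpenVR render loop (sketch). Link against libopenvr_api;
                      // the GL textures are assumed to exist and are not created here.
                      #include <openvr.h>
                      #include <cstdint>

                      int main() {
                          vr::EVRInitError err = vr::VRInitError_None;
                          vr::IVRSystem* sys = vr::VR_Init(&err, vr::VRApplication_Scene);
                          if (err != vr::VRInitError_None) return 1;

                          uint32_t w = 0, h = 0;
                          sys->GetRecommendedRenderTargetSize(&w, &h);  // per-eye render target size

                          uint32_t glLeftTex = 0, glRightTex = 0;  // assumed: one GL texture per eye

                          vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
                          for (int frame = 0; frame < 1000; frame++) {
                              // Blocks until the compositor wants the next frame; fills in poses.
                              vr::VRCompositor()->WaitGetPoses(poses, vr::k_unMaxTrackedDeviceCount,
                                                               nullptr, 0);
                              // ... render the scene into glLeftTex/glRightTex using poses[0] ...

                              vr::Texture_t left  = { (void*)(uintptr_t)glLeftTex,
                                                      vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
                              vr::Texture_t right = { (void*)(uintptr_t)glRightTex,
                                                      vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
                              vr::VRCompositor()->Submit(vr::Eye_Left,  &left);
                              vr::VRCompositor()->Submit(vr::Eye_Right, &right);
                          }
                          vr::VR_Shutdown();
                      }
                      ```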
                      The OSVR SDK is a similar SDK, but on Linux nobody really uses it. It's maybe useful for its SteamVR-OSVR plugin, which can bring support for OSVR/VRPN-supported HMDs and controllers to SteamVR, though OpenHMD is hoping to provide all the relevant support now that OSVR is not very active anymore.
                      OpenHMD also has an API an application can use, but it's a bit more basic and doesn't include advanced rendering features yet (a minimal usage sketch follows below). All it provides is a GLSL shader that does the distortion and an example of how to use it. Nevertheless a few applications do use it; for example VLC 4.0 will use it, and there's already a godot_openhmd plugin. Also, OpenHMD is still struggling with implementing positional tracking and controller support for the Oculus Rift CV1 and HTC Vive, but it's making progress.
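                      Reading the HMD orientation through OpenHMD's C API is short enough to show in full; this sketch is modeled on the project's own simple example, and build flags may differ per distro:

                      ```cpp
                      // Minimal OpenHMD orientation readout.
                      // Build (typically): g++ pose.cpp $(pkg-config --cflags --libs openhmd)
                      #include <openhmd.h>
                      #include <cstdio>

                      int main() {
                          ohmd_context* ctx = ohmd_ctx_create();
                          int count = ohmd_ctx_probe(ctx);        // enumerate supported devices
                          if (count <= 0) { std::printf("no HMD found\n"); return 1; }

                          ohmd_device* hmd = ohmd_list_open_device(ctx, 0);
                          if (!hmd) return 1;

                          for (int i = 0; i < 100; i++) {
                              ohmd_ctx_update(ctx);               // pump the device
                              float quat[4];
                              ohmd_device_getf(hmd, OHMD_ROTATION_QUAT, quat);
                              std::printf("orientation: %f %f %f %f\n",
                                          quat[0], quat[1], quat[2], quat[3]);
                          }
                          ohmd_ctx_destroy(ctx);
                      }
                      ```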

                      There isn't really anything stopping applications from implementing support for multiple VR SDKs and letting the user choose which one to use, but of course that's more work, which not many have done.
                      There are very few VR games on Linux, and most of the ones that exist have opted to make use of only OpenVR/SteamVR and require SteamVR to run in VR. Here's a mostly complete list with discussions: https://steamcommunity.com/app/25082...7959064016658/
                      Maybe with Godot we will see some more projects support VR with OpenHMD...

                      Once OpenXR is released and a majority of applications have switched to it, the situation will be much better.
                      Here is how OpenXR envisions it: https://www.khronos.org/assets/uploa...7-openxr-2.jpg
                      As an example, today Dota 2 uses the OpenVR/SteamVR API and requires SteamVR to run. Once OpenXR is released, games like Dota 2 will hopefully be updated to use the OpenXR API instead. SteamVR, the Oculus SDK, OpenHMD, etc. will all implement the OpenXR API and become interchangeable runtimes. So in the future, an OpenXR-enabled Dota 2 could run on an OpenXR-enabled SteamVR, on an OpenXR-enabled Oculus SDK, on an OpenXR-enabled OpenHMD, etc.

