PlayStation VR HMD Working On Linux With SteamVR/Dota 2 Thanks To OpenHMD


  • #21
    hoping for Oculus Go support.

    Comment


    • #22
      Originally posted by OneTimeShot View Post
      “I am confused, buddy, are you one of the developers? If yes - then what you say makes no sense. I am not a developer”

      dude - what are you talking about? $50 headset? Cardboard? Just an API? Latency not important?

      I can guarantee you that VR is more difficult than that. If you are going to build a game system that doesn't give you a headache and make you vomit, you are dealing with serious engineering complexity.

      What’s next? “It’s just a self driving electric car for $2000”?
      I may have used the wrong wording in my previous posts. I did not mean to be overly familiar with anyone, though now I can see why it looked that way. When I said "face toaster" I meant a mobile phone in a Cardboard shell. I won't feel motion sick in games that aren't intense or where I don't actually move (watching a DOTA 2 match, playing Civilization, or something similar).

      Windows allows for VR using an Android phone plus Cardboard or similar, with no other investment. It would be nice if we could do that too. That's all. Poor man's VR.

      Comment


      • #23
        Originally posted by starshipeleven View Post
        Low latency is required in VR headsets because the VR screen must "move" AS SOON AS you move your head, or you will feel very uncomfortable and potentially get nauseous. This is what OneTimeShot also mentions in his post.

        Latency has to be very low for this reason; moving the VR "screen" the moment you move your head is one of the things that makes VR hard.

        The game is not relevant; it is simply a matter of moving the image on the VR screen as soon as you move your head. Even looking at a static image will cause issues if there isn't low latency.

        With a smartphone it would run like crap, because smartphone-to-PC communication adds a lot of latency.

        Cardboard runs inside the phone, so there you don't have much latency. If DOTA 2 made a VR app for Android (one that works without a PC), you would be able to use that on your phone. The issue is smartphone-to-PC communication; if everything is done by the smartphone with no PC, there aren't many latency issues.

        I don't think a smartphone can actually run a DOTA 2 VR application at the same quality as a PC, of course. Graphics quality would have to be lowered a lot to run on a phone.
        Point taken. For me, head motion is irrelevant; I can keep my head motionless. What matters more to me is the 3D picture. But that's just me...

        I can keep my head in the same position, after all.

        Comment


        • #24
          Originally posted by haagch View Post
          The problem is that there is no standard for VR-related hardware yet (until OpenXR later this year). If you plug in a mouse and a keyboard, your operating system automatically connects them to the standard way of handling keyboards and mice, and they will instantly work with all applications that support the standard, which is all of them on X11/X.org because that's where the relevant APIs are.

          A VR headset is at its core an input device and an output device.

          The output is obviously the display.
          Most have a standard display connector via HDMI or DisplayPort and present as a single display with a resolution of 2160x1200 or so.
          Unfortunately just extending the desktop to that display and starting a game will not look good, because the display is divided between the eyes (some actually use two separate displays, but have a display controller that handles it as one display): the left eye sees the left part of the image and the right eye sees the right part of the image.
          So an application that wants to render a VR view needs to split and double the image, and to create a 3D effect, it needs to render the scene for each eye from a slightly different viewpoint. For this the application already needs some knowledge about the hardware. I believe with the HTC Vive the dimensions for the right and left eye are not actually symmetrical. The application also needs to know the distance between the centers of the lenses and other values like the distance from screen to lenses etc. to create a proper 3D effect.
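          To make the per-eye viewpoint shift concrete, here is a minimal sketch; the vector form and the ~0.063 m IPD default are illustrative assumptions, not any particular SDK's API:

          #include <cstdio>

          struct Vec3 { float x, y, z; };

          // Offset the head-pose camera sideways for one eye: 'right' is the
          // head's local right axis, 'ipd' the interpupillary distance in meters.
          Vec3 eye_position(Vec3 head, Vec3 right, float ipd, bool left_eye)
          {
              float half = (left_eye ? -0.5f : 0.5f) * ipd;
              return { head.x + right.x * half,
                       head.y + right.y * half,
                       head.z + right.z * half };
          }

          int main()
          {
              Vec3 head  = { 0.0f, 1.7f, 0.0f };  // head position in meters
              Vec3 right = { 1.0f, 0.0f, 0.0f };  // head's right axis
              Vec3 l = eye_position(head, right, 0.063f, true);
              Vec3 r = eye_position(head, right, 0.063f, false);
              printf("left eye  (%.4f, %.4f, %.4f)\n", l.x, l.y, l.z);
              printf("right eye (%.4f, %.4f, %.4f)\n", r.x, r.y, r.z);
          }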
          Lenses - they are needed because you can't focus your eyes on a display a few centimeters in front of your face. The lenses make it so your eyes can "focus" at an infinite distance and relax. (Current hardware is limited to just this one focal level and everything will always look in focus, which sometimes looks a bit unnatural. Future hardware will include more sophisticated displays where you can focus at different distances.) Unfortunately most practical lens shapes produce a "barrel distortion" effect. An application that renders a VR view needs to know the characteristics of this distortion, so it can render its image "reverse distorted", which makes it look undistorted when viewed through a lens with that distortion.
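          The "reverse distortion" idea in code, assuming the common radial polynomial lens model; the k1/k2 coefficients below are made up (real ones come from the HMD), and a real implementation does this per pixel in a fragment shader, but the math is the same:

          #include <cstdio>

          struct UV { float u, v; };

          // Map an output-pixel coordinate (centered on the lens, range -1..1)
          // to the source texture coordinate to sample, so the lens's barrel
          // distortion is cancelled when viewed through it.
          UV pre_distort(UV p, float k1, float k2)
          {
              float r2 = p.u * p.u + p.v * p.v;  // squared distance from lens center
              float scale = 1.0f + k1 * r2 + k2 * r2 * r2;
              return { p.u * scale, p.v * scale };
          }

          int main()
          {
              UV corner = { 0.8f, 0.8f };
              UV s = pre_distort(corner, 0.22f, 0.24f);  // made-up coefficients
              printf("sample source texture at (%.4f, %.4f)\n", s.u, s.v);
          }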

          The input is first the HMD itself.
          Having a static image fill your FOV is quite disorienting and not very comfortable, especially for games. The first simple improvement is an IMU - a hardware sensor that knows where "down" is (thanks to gravity) and can detect rotations. A game that gets input from an IMU can detect when a user looks e.g. "up" and rotate the view accordingly, which already creates some of the illusion of looking around inside a game world. IMUs can also detect acceleration, i.e. when you move "forward", "up", etc., but that's way too unreliable to use as the sole source of positional tracking.
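          As an aside, "rotate the view accordingly" boils down to rotating the camera's forward vector by the IMU's orientation quaternion. A self-contained sketch, with the quaternion math written out by hand and no SDK assumed:

          #include <cstdio>

          struct Quat { float w, x, y, z; };
          struct Vec3 { float x, y, z; };

          // Rotate vector v by unit quaternion q (computes q * v * q^-1).
          Vec3 rotate(Quat q, Vec3 v)
          {
              // t = 2 * cross(q.xyz, v)
              Vec3 t = { 2.0f * (q.y * v.z - q.z * v.y),
                         2.0f * (q.z * v.x - q.x * v.z),
                         2.0f * (q.x * v.y - q.y * v.x) };
              // v' = v + w * t + cross(q.xyz, t)
              return { v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
                       v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
                       v.z + q.w * t.z + (q.x * t.y - q.y * t.x) };
          }

          int main()
          {
              Quat look_up_45 = { 0.9239f, 0.3827f, 0.0f, 0.0f };  // 45 deg about X
              Vec3 forward    = { 0.0f, 0.0f, -1.0f };             // looking down -Z
              Vec3 rotated    = rotate(look_up_45, forward);
              printf("new forward: (%.4f, %.4f, %.4f)\n",          // (0, 0.707, -0.707)
                     rotated.x, rotated.y, rotated.z);
          }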
          So there are a bunch of different positional tracking systems that detect where exactly in space the HMD is. Oculus uses a standard high performance webcam with an infrared sensor and places IR LEDs on the HMD, using computer vision algorithms to calculate the position in space (relative to the camera). The HTC Vive has basestations that flood the room with laser sweeps and places IR sensors on the HMD, calculating the position in space from the tiny time differences between when the sweeps hit each of the sensors. The Windows Mixed Reality HMDs have 2 webcams on the HMD and use computer vision to do "inside out tracking" that is purely camera based. These positional tracking systems are usually not quite as fast as necessary, so they use complex filtering (like Kalman filters) to fuse the data from the IMUs with the data from the positional tracking systems and get an accurate and mostly jitter-free pose (position + rotation) of the HMD in space.
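          A full Kalman filter is too much for a forum post, but a one-axis complementary filter shows the same fusion idea in a few lines: mostly trust the fast but drifty IMU, and let the slow but absolute positional tracker pull the estimate back. The blend constant below is illustrative:

          #include <cstdio>

          // 'imu_delta' is the motion integrated from the IMU since the last
          // update; 'tracker' is the latest absolute (but jittery, lower-rate)
          // positional-tracking measurement for the same axis.
          float fuse(float estimate, float imu_delta, float tracker, float alpha)
          {
              // Mostly follow the IMU (smooth, low latency), but blend in a small
              // share of the tracker so IMU drift cannot accumulate over time.
              return alpha * (estimate + imu_delta) + (1.0f - alpha) * tracker;
          }

          int main()
          {
              float pose = 0.0f;
              for (int i = 0; i < 5; ++i)
                  pose = fuse(pose, 0.01f, 0.0f, 0.98f);  // IMU drifts, tracker says "still"
              printf("fused estimate: %f\n", pose);       // grows far slower than pure IMU
          }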
          Many HMDs come with controllers. They are usually tracked with similar techniques. Additionally they have buttons, triggers, touchpads, etc. - but usually those buttons are not hooked up to any standard API - a typical game is not even aware that the Vive controller is an input device, and will certainly not receive any input.

          All of the above - knowledge about HMD specifics like lens separation and distortion, the algorithms to calculate the pose of the HMD and controllers, receiving button input from the controllers - could be done in a game that wants to support VR.
          But most of the positional tracking algorithms for example are proprietary. (That's why OpenHMD is reverse engineering the hardware and developing an open source framework).
          Also it is way too complicated to keep track of all the HMDs out there and support them. For example if you want to know the lens parameters of the HTC Vive, you can actually request them over USB from the Vive headset and get some JSON back which you then have to parse and apply your own distortion correction algorithm.

          That is why VR SDKs are a thing. The basic functionality of a VR SDK is usually (a rough code sketch follows the list):
          * Receive poses (Quaternion + 3D Vector, or transformation matrix) for each tracked device (HMD, controllers)
          * Receive button/touchpad/trigger input from the controllers
          * Receive some basic values of the HMD like resolution, user IPD (distance between eyes, for the 3D effect)
          * For each rendered frame, submit an undistorted texture/image for each eye, the VR SDK will handle all the advanced rendering magic like "reverse distorting" the image appropriately for the lens, interpolating between frames when the application can't keep up
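          A rough sketch of how an application drives those four capabilities each frame; every name in this snippet is invented for illustration and matches no real SDK:

          struct Pose { float orientation[4]; float position[3]; };  // quaternion + vector

          struct VrSdk {                                   // stand-in for a real runtime
              Pose  hmd_pose() { return {}; }              // 1: poses of tracked devices
              bool  button_pressed(int ctrl) { return false; }  // 2: controller input
              float ipd() { return 0.063f; }               // 3: basic HMD values
              void  submit(int eye, unsigned texture) {}   // 4: per-eye undistorted images
          };

          void render_frame(VrSdk& sdk, unsigned left_tex, unsigned right_tex)
          {
              Pose  head = sdk.hmd_pose();   // where to render the scene from
              float ipd  = sdk.ipd();        // eye separation for the 3D effect
              // ... render the scene twice, eyes offset by +-ipd/2, into the textures ...
              sdk.submit(0, left_tex);       // the runtime applies lens pre-distortion,
              sdk.submit(1, right_tex);      // frame interpolation, etc. before display
          }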

          Of course everyone does their own SDK/runtime with their own API. So if you want to develop a VR application you have to choose.
          If you use the Oculus SDK, you link to libOVR.dll and use *their* functions to do all the stuff a VR SDK does. Of course then your application will only work on Windows, because the Oculus SDK is Windows-only. Also, the Oculus SDK will only give you compatibility with the Oculus Rift HMD, because that's all the Oculus SDK wants to know. (There's a hack called Revive that translates the Oculus API to SteamVR and makes it possible to run games built for the Oculus SDK with SteamVR instead.)
          If you use OpenVR/SteamVR, you link to libopenvr_api.so, use *their* functions, etc. SteamVR is still a proprietary runtime, but at least they have published an API and documentation on how to create plugins that implement support for an HMD in SteamVR. SteamVR-OpenHMD is such a plugin, only instead of adding support for one HMD it dynamically feeds values from whatever OpenHMD-supported HMD is currently plugged in to SteamVR. As a more ambitious project you can also take OpenVR's header file and start implementing all their proprietary functionality yourself, e.g. by making use of OpenHMD. I actually started this, but it doesn't do much yet: https://github.com/ChristophHaag/openvr_api-libre/
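          For OpenVR that boils down to a skeleton like the following, based on the published openvr.h (error handling and the actual OpenGL rendering are omitted; treat it as a sketch, not a complete program):

          #include <openvr.h>
          #include <cstdint>

          int main()
          {
              vr::EVRInitError err = vr::VRInitError_None;
              vr::IVRSystem* sys = vr::VR_Init(&err, vr::VRApplication_Scene);
              if (err != vr::VRInitError_None) return 1;

              uint32_t w = 0, h = 0;
              sys->GetRecommendedRenderTargetSize(&w, &h);  // per-eye render target size

              vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
              for (;;) {
                  // Blocks until the runtime hands back fresh poses for this frame.
                  vr::VRCompositor()->WaitGetPoses(poses, vr::k_unMaxTrackedDeviceCount,
                                                   nullptr, 0);

                  uint32_t left_tex = 0, right_tex = 0;  // ... render both eyes here ...

                  vr::Texture_t left  = { (void*)(uintptr_t)left_tex,
                                          vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
                  vr::Texture_t right = { (void*)(uintptr_t)right_tex,
                                          vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
                  vr::VRCompositor()->Submit(vr::Eye_Left,  &left);   // the compositor does
                  vr::VRCompositor()->Submit(vr::Eye_Right, &right);  // the lens distortion
              }
              vr::VR_Shutdown();  // a real app breaks out of the loop before this
          }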
          The OSVR SDK is a similar SDK, but on Linux nobody really uses it. It's maybe useful for its SteamVR-OSVR plugin, which can bring support for OSVR/VRPN-supported HMDs and controllers to SteamVR, though OpenHMD is hoping to provide all the relevant support now that OSVR is not very active anymore.
          OpenHMD also has an API an application can use, but it's a bit more basic and doesn't include advanced rendering features yet. All it provides is a GLSL shader that does the distortion and an example of how to use it. Nevertheless a few applications do use it: for example, VLC 4.0 will use it, and there's already a godot_openhmd plugin. OpenHMD is also still struggling with implementing positional tracking and controller support for the Oculus Rift CV1 and HTC Vive, but it's making progress.
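          OpenHMD's API is a small C API; this is roughly what its bundled "simple" example does (a sketch - check openhmd.h for the exact signatures):

          #include <openhmd.h>
          #include <cstdio>

          int main()
          {
              ohmd_context* ctx = ohmd_ctx_create();
              if (ohmd_ctx_probe(ctx) < 1) return 1;    // enumerate plugged-in devices

              ohmd_device* hmd = ohmd_list_open_device(ctx, 0);

              for (int i = 0; i < 100; ++i) {
                  ohmd_ctx_update(ctx);                 // pump the device drivers
                  float quat[4];                        // orientation quaternion
                  ohmd_device_getf(hmd, OHMD_ROTATION_QUAT, quat);
                  printf("rotation: %f %f %f %f\n", quat[0], quat[1], quat[2], quat[3]);
              }
              ohmd_ctx_destroy(ctx);
          }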

          There isn't really anything stopping applications from implementing support for multiple VR SDKs and letting the user choose which one to use, but of course that's extra work that not many have done.
          There are very few VR games on Linux, and most of the ones that exist have opted to use only OpenVR/SteamVR and require SteamVR to run in VR. Here's a mostly complete list with discussions: https://steamcommunity.com/app/25082...7959064016658/
          Maybe with Godot we will see some more projects support VR with OpenHMD...

          Once OpenXR is released and a majority of applications have switched to it, the situation will be much better.
          Here is how OpenXR envisions it: https://www.khronos.org/assets/uploa...7-openxr-2.jpg
          As an example, today Dota 2 uses the OpenVR/SteamVR API and requires SteamVR to run. Once OpenXR is released, games like Dota 2 will hopefully be updated to use the OpenXR API instead. SteamVR, Oculus SDK, OpenHMD etc. will all implement the OpenXR API and will become interchangeable runtimes. So in the future, an OpenXR enabled Dota 2 could run on an OpenXR enabled SteamVR, on an OpenXR enabled Oculus SDK, on an OpenXR enabled OpenHMD, etc.
          THIS should be its own article...

          I shall ask Michael to maybe publish it as such.

          Comment


          • #25
            Eh, it could be better structured and worded and include some info about game engine VR SDK support. But sure, feel free to take any info and incorporate it into an article.

            Comment


            • #26
              Originally posted by haagch View Post
              Eh, it could be better structured and worded and include some info about game engine VR SDK support. But sure, feel free to take any info and incorporate it into an article.
              Feel like writing an article? I'd say Michael wouldn't mind publishing stuff you would put together. Michael, what do you think?

              Comment


              • #27
                Sony have basically proven that 60 FPS is quite usable for VR. The way the PSVR operates, the HDMI decoder box doubles the framerate of games from 60 fps to 120 fps using a hardware framerate doubler that injects extra frames. Is it the absolute best the technology will ever be? Of course not. But for someone like myself who just wants to play Wing Commander-style space games, where I'm sitting down with the headset on, looking around a virtual cockpit and shooting at enemy ships, it's quite adequate. I'd just like a higher-resolution headset.

                I've never been interested in motion controls or any of that other garbage. I have a $900 HOTAS joystick setup for input; I don't want some dodgy motion controller. For games where you are sitting and driving a vehicle, head tracking, lens correction, and 60 FPS are all that's required. Vulkan isn't even necessary in most of those use cases.

                The killer VR app is the Star Wars demo for PSVR, soon to be surpassed by the X-Wing VR remake being made in Unity. What I want to see is Descent 1/2 and Freespace 2 modified to have VR support. I took a crack at adding VR to Freespace 2 last year, but my programming skills are pretty garbage. I had the renderer split but still needed to do perspective correction and lens distortion. The immersion from looking out the cockpit window and seeing your wingman on your wing is intense, and watching enemy ships pop in 3D is awesome fun. That's what I'm interested in with VR.

                http://www.thrustmaster.com/sites/de...uct800x600.jpg
                Last edited by DMJC; 27 May 2018, 06:06 PM.

                Comment


                • #28
                  Originally posted by DMJC View Post
                  What I want to see is Descent 1/2 and Freespace 2 modified to have VR support.
                  I just had a small taste of vomit in my mouth at the thought.

                  I get it, though. If you're particularly resilient to motion sickness, that would be cool. I used Descent with 3D glasses and a 6-DoF controller, BITD.



                  The seated 3D games that most interest me would be RTS-style warfare games. Like virtual tabletop gaming.

                  Comment


                  • #29
                    Originally posted by haagch View Post
                    Eh, it could be better structured and worded and include some info about game engine VR SDK support.
                    This.

                    I feel like most games probably use an engine (e.g. Unity 3D) to implement VR support, which probably gives them automatic support for multiple HMDs. On Windows, at least...

                    If you want the most sophisticated VR optimizations and the best device support, this is probably the way to go.

                    Comment


                    • #30
                      • Godot
                      • Unreal Engine 4
                      • Unity
                      • WebVR browsers
                        • Chrome/Chromium: Oculus and OpenVR on Windows only
                        • Firefox:
                          • Oculus: Windows only
                          • OpenVR: Windows and Mac
                          • OSVR: long abandoned in an unfinished state, but still in the codebase
                      The engine situation is actually halfway decent, but most existing projects first need to update to a newer engine version before they can think about trying to make a Linux build.

                      As for the browsers and WebVR, I just don't know; it just looks like they aren't interested in libre platforms.

                      Comment
