Here Is What Happens When Trying To Use Non-NVIDIA Drivers To Play XCOM 2 On Linux


  • #21
    I'm extremely surprised that the game crashes before anything even appears with the Mesa drivers, ESPECIALLY the Intel ones. Unless the game absolutely requires OpenGL 4.3+, the Mesa drivers are feature complete, if not fast. The Mesa drivers are also known to have far fewer bugs than, say, Catalyst, so something should at least appear.

    Unless, of course, the game is trying to use behavior that Nvidia is slack about and allows, but that Intel and AMD are both strict about, causing instant crashes. Not that we'll ever know for certain, so it's always chalked up to "conspiracy theories", but whatever.
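    For what it's worth, a minimal startup check along these lines (purely a sketch with a hypothetical helper, assuming a GL context is already current, e.g. created via SDL or GLX) is all it would take to exit cleanly with a message instead of crashing when the reported version is too low:

    Code:
    /* Hypothetical helper: assumes a GL context is current. Reads the
     * version the driver reports (GL_MAJOR/MINOR_VERSION are core since 3.0)
     * and bails out with a readable error instead of crashing. */
    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <stdio.h>
    #include <stdlib.h>

    void require_gl_version(int need_major, int need_minor)
    {
        GLint major = 0, minor = 0;
        glGetIntegerv(GL_MAJOR_VERSION, &major);
        glGetIntegerv(GL_MINOR_VERSION, &minor);
        if (major < need_major || (major == need_major && minor < need_minor)) {
            fprintf(stderr, "OpenGL %d.%d required, driver reports %s\n",
                    need_major, need_minor, (const char *)glGetString(GL_VERSION));
            exit(EXIT_FAILURE);
        }
    }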

    Comment


    • #22
      Originally posted by chuckula View Post

      Not overly shocked since OpenGL ES is a somewhat simplified version of the standard that has a higher likelihood of being supported by more hardware. For example, modern versions of KDE often default to OpenGL ES for desktop acceleration.

      You also fail to provide any evidence for your accusation that Feral is intentionally trying to sabotage everyone else and only support Nvidia. Hate to break it to you, but OpenGL ES sure isn't some proprietary Nvidia thing.
      GLES on desktop is definitely not a standard thing; even though Mesa has offered it for quite a while (which is why some DEs use it), support for it in the proprietary desktop drivers is very recent. I really doubt they coded the game against GLES; that claim sounds highly suspect to me.
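      As an illustration of why GLES-on-desktop is driver-dependent, this is roughly what an application has to do to explicitly ask for a GLES context through EGL (just a sketch; whether it succeeds depends entirely on the driver exposing GLES via EGL):

      Code:
      /* Build with: cc gles_probe.c -lEGL. Tries to create a GLES 2.0
       * context on whatever EGL display the driver provides. */
      #include <EGL/egl.h>
      #include <stdio.h>

      int main(void)
      {
          EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
          if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, NULL, NULL)) {
              fprintf(stderr, "No usable EGL display\n");
              return 1;
          }
          eglBindAPI(EGL_OPENGL_ES_API);

          const EGLint cfg_attribs[] = { EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT, EGL_NONE };
          EGLConfig cfg;
          EGLint n = 0;
          eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &n);

          const EGLint ctx_attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
          EGLContext ctx = (n > 0)
              ? eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs)
              : EGL_NO_CONTEXT;

          printf("GLES 2.0 context: %s\n", ctx != EGL_NO_CONTEXT ? "available" : "not available");
          if (ctx != EGL_NO_CONTEXT)
              eglDestroyContext(dpy, ctx);
          eglTerminate(dpy);
          return 0;
      }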

      Comment


      • #23
        There is nothing different about Nvidia's drivers compared to any other driver, no extra extensions or hacks. The thing is that Nvidia uses an extra A.I. that replaces shaders with ones reconstructed by that same program, the way Nvidia wants them to be, without giving any error report for what was wrong in the original shader. So everything just works. Furthermore, with this technique Nvidia blends some known post-processing FX at the shader level rather than over the entire picture as intended, giving the impression that their GPUs are a lot more powerful than they really are. That is why known testers say that Nvidia will not get any big benefit even if they manage to support "Async compute", which does the same thing for more (compute-based) FX, but under the developer's responsibility. Nvidia just manages well any garbage produced by 'non well educated' developers.
        Last edited by artivision; 04 February 2016, 05:04 PM.

        Comment


        • #24
          Originally posted by nadro View Post
          WTF? It looks like Feral uses OpenGL ES calls in this game (why do they use the OGL ES Vertex Array Object extension when VAO has been in OpenGL core since 3.0?)... It looks like this game doesn't use OpenGL, but "NvidiaGL -> OpenGL with a lot of hacks available only on NV drivers". For me this is proof that Feral's D3D-to-OGL wrapper is really bad and I won't buy their games until they fix that.

          And another thing: if a program just crashes instead of exiting properly with a dialog message, e.g. when some OGL feature is unavailable, it reflects very badly on their developers and/or QA.
          GL_ARB_vertex_attrib_binding is supported on all tested drivers but might need a Mesa override.
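          For reference, checking for it at run time is straightforward (a sketch only, assuming a current core-profile context; the "Mesa override" would be something like the MESA_GL_VERSION_OVERRIDE or MESA_EXTENSION_OVERRIDE environment variables, in case the driver has the feature but doesn't advertise the version the game asks for):

          Code:
          /* Sketch: walk the extension list of the current context and look
           * for a named extension. Define GL_GLEXT_PROTOTYPES so glGetStringi
           * is declared, or resolve it through your GL loader instead. */
          #define GL_GLEXT_PROTOTYPES
          #include <GL/gl.h>
          #include <GL/glext.h>
          #include <string.h>

          int has_extension(const char *name)
          {
              GLint count = 0;
              glGetIntegerv(GL_NUM_EXTENSIONS, &count);
              for (GLint i = 0; i < count; ++i) {
                  const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
                  if (ext && strcmp(ext, name) == 0)
                      return 1;
              }
              return 0;
          }

          /* e.g. if (!has_extension("GL_ARB_vertex_attrib_binding")) { fall back / report } */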

          Comment


          • #25
            "Non well educated developers" is still way more accusatory and condescending than necessary. In the Linux world, it's like not going out of your way to support every possible ancient hardware combination and code things perfectly using plain old C to the absolute letter of a standard, no matter how complex or arbitrary it is, is somehow unacceptable. There is barely enough infrastructure in place right now to get these modern high-budget titles released on Linux and I'd like to give them as much encouragement as possible. Sometimes things don't conform perfectly to the standard. Sometimes the standard is unclear. The goal is to make it work.
            Last edited by axfelix; 04 February 2016, 05:08 PM.

            Comment


            • #26
              Just skimming through threads on the Steam community forum suggests the problem might be the same as with Arkham Knight: way too many people with way too many different problems on all platforms to blame the drivers.

              Comment


              • #27
                Update: http://www.phoronix.com/scan.php?pag...onSI-G3D-XCOM2
                Michael Larabel
                https://www.michaellarabel.com/

                Comment


                • #28
                  Originally posted by chuckula View Post

                  I think that some people around here don't want to actually see any test results Michael. I think they just want a wall of text rant about how AMD GOOD and everyone else BAD.

                  There are plenty of other websites out there that cater to their needs.
                  Originally posted by Michael
                  What are you talking about? The Radeon driver was where most of the tests happened.
                  I'm making this real easy for you two: On the first page of the article Michael wrote this:
                  Mesa 11.2-devel with LLVM 3.9 (via the Padoka PPA) and the Linux 4.5-rc2 kernel (with AMDGPU PowerPlay enabled) was used for this testing... I don't plan on doing any older Mesa tests, since Mesa Git offers the best OpenGL 4 extension coverage as well as the best performance
                  Which is, for obvious reasons, untrue. It also means AMDGPU was used for the entire test (unless I'm missing something here).

                  Comment


                  • #29
                    To be honest, even the 'officially supported' Nvidia implementation is pretty sub-par. Due to a combination of Nvidia being intransigent about how RandR 1.2 should be used and Feral's rushed implementation, I can't change resolution no matter what I do: the game hard-resets the resolution to whatever is reported via RandR 1.2, and Nvidia only returns native EDID-reported resolutions (they refuse to provide scaled metamodes via RandR 1.2, even though every other driver does, because reasons). Even if I change the desktop resolution, I end up seeing only a fraction of the game viewport: the game will only ever render at my panel's native 4K (and it always overwrites any changes made to the config by hand), and at 4K the game is completely unplayable, performance-wise, on a 970m.

                    Really don't know what to do here, because I'd like not to miss my refund window if it's not going to be fixed, but I really would like to play this, if it was playable.
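                    (For the curious, this is roughly what a RandR 1.2 client, like the game's resolution code, actually gets to see; a sketch only, built with -lX11 -lXrandr. On the Nvidia driver the list contains only the panel's EDID modes, with no scaled metamodes, which matches the behaviour above.)

                    Code:
                    /* Enumerate the modes RandR 1.2 reports for the screen. */
                    #include <X11/Xlib.h>
                    #include <X11/extensions/Xrandr.h>
                    #include <stdio.h>

                    int main(void)
                    {
                        Display *dpy = XOpenDisplay(NULL);
                        if (!dpy)
                            return 1;

                        XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
                        for (int i = 0; i < res->nmode; ++i)
                            printf("%.*s (%ux%u)\n",
                                   (int)res->modes[i].nameLength, res->modes[i].name,
                                   res->modes[i].width, res->modes[i].height);

                        XRRFreeScreenResources(res);
                        XCloseDisplay(dpy);
                        return 0;
                    }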

                    Comment


                    • #30
                      Originally posted by JLSalvador View Post
                      Crashes too with the proprietary NVIDIA v361 driver on a 980 :'-(

                      Code:
                      [0204/200309:ERROR:gl_context_glx.cc(107)] Couldn't make context current with X drawable.
                      [0204/200309:ERROR:gles2_cmd_decoder.cc(3200)] GLES2DecoderImpl: Context lost during MakeCurrent.
                      [0204/200309:ERROR:gl_bindings_autogen_gl.cc(6831)] Trying to call glGetGraphicsResetStatusARB() without current GL context
                      [0204/200309:ERROR:gl_bindings_autogen_gl.cc(6831)] Trying to call glGetGraphicsResetStatusARB() without current GL context
                      [0204/200309:WARNING:x11_util.cc(1490)] X error received: serial 838, error_code 171 (GLXBadDrawable), request_code 154, minor_code 26 (X_GLXMakeContextCurrent)
                      [0204/200309:ERROR:gl_context_glx.cc(107)] Couldn't make context current with X drawable.
                      [0204/200309:ERROR:gles2_cmd_decoder.cc(3200)] GLES2DecoderImpl: Context lost during MakeCurrent.
                      [0204/200309:ERROR:gl_bindings_autogen_gl.cc(6831)] Trying to call glGetGraphicsResetStatusARB() without current GL context
                      [0204/200309:WARNING:x11_util.cc(1490)] X error received: serial 868, error_code 8 (BadMatch (invalid parameter attributes)), request_code 154, minor_code 26 (X_GLXMakeContextCurrent)
                      [0204/200309:ERROR:gl_bindings_autogen_gl.cc(6522)] Trying to call glDeleteVertexArraysOES() without current GL context
                      [0204/200309:ERROR:gl_bindings_autogen_gl.cc(6512)] Trying to call glDeleteTextures() without current GL context
                      That looks like Chromium debug output, probably just from an embedded webview. That also explains why it is GLES: Chromium uses GLES (or WebGL) to transfer OpenGL commands between its processes.

                      Comment
