Gaming On Wine: The Good & Bad Graphics Drivers


  • #31
    Originally posted by Artemis3 View Post
    I play Skyrim with a GTX460 for hours on end (until I fall asleep) just fine. What you describe might happen if you forget to modify the Skyrim binary to become "Large Address Aware", which you only need to do once in older versions. This patch has been official since at least Dec 2011, so users on the Steam version would not notice.

    It is a long-known fact: ATI = problems for Wine. You are very lucky to have games work out of the box, while with Nvidia that is rather common. The ATI experience on Linux is very poor, and I see Intel far more committed to the community, so the support should go there instead. I would never buy a system with ATI, but an Ivy Bridge system next year should probably work well.

    It's been many years and AMD has failed to deliver solid GPU support (even Windows users complain); that is a fact. Linux is rather low on their list of priorities, next to "don't care". You are very lucky to find a select model with a select Catalyst version and select versions of Xorg, etc. to make ATI work with select games, instead of Nvidia's "most things just work out of the box" experience.
    He is not lucky. The Catalyst drivers AMD provides aren't as bad as people describe. I have been using AMD on Linux for longer than I have used Nvidia, and the performance is almost comparable now; it wasn't some years ago. But you have to tweak your xorg.conf on AMD to get it working like it would with Nvidia. Skyrim works great for me too with Wine, and so do Crysis and a lot of other graphics-heavy games.
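
    For reference, the "Large Address Aware" tweak mentioned in the quote just flips one bit in the executable's PE header (the same thing editbin /LARGEADDRESSAWARE or the various LAA patcher tools do). Here is a minimal C sketch of the idea; it has no error handling, the file name is whatever you pass in, and it is illustration only, so back up the binary before trying anything like this:

    Code:
    /* laa.c - set IMAGE_FILE_LARGE_ADDRESS_AWARE on a PE executable (toy sketch). */
    #include <stdio.h>
    #include <stdint.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) { fprintf(stderr, "usage: %s game.exe\n", argv[0]); return 1; }

        FILE *f = fopen(argv[1], "r+b");
        if (!f) { perror("fopen"); return 1; }

        /* e_lfanew at offset 0x3c points to the "PE\0\0" signature. */
        uint32_t pe_off;
        fseek(f, 0x3c, SEEK_SET);
        fread(&pe_off, sizeof(pe_off), 1, f);

        /* Characteristics sits 18 bytes into the COFF file header, which starts
         * 4 bytes after the PE signature (assumes a little-endian host). */
        long char_off = pe_off + 4 + 18;
        uint16_t flags;
        fseek(f, char_off, SEEK_SET);
        fread(&flags, sizeof(flags), 1, f);

        flags |= 0x0020; /* IMAGE_FILE_LARGE_ADDRESS_AWARE */

        fseek(f, char_off, SEEK_SET);
        fwrite(&flags, sizeof(flags), 1, f);
        fclose(f);
        printf("LAA flag set on %s\n", argv[1]);
        return 0;
    }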



    • #32
      Originally posted by bridgman View Post
      The compiler is one part of the driver. The compilers translate shader source in GLSL or HLSL to GPU hardware ISA, but the driver handles the rest of the work, i.e. the stream of state-change and drawing commands, getting work to the hardware and getting results back. The compiler is a separate piece of code called by the driver only when it encounters an API command to compile a shader.



      Actually no -- the compiler is not involved with actual drawing operations. The compiler generates code which is run on the shader cores, and that code in turn runs when the driver tells the GPU hardware to (for example) run the vertex shader on each element in an array of vertices, reassemble the vertices into triangles, and then generate pixels from the triangles and run the pixel shader on each pixel in a triangle.



      The front end of shader compilers is pretty much always totally different between D3D and OpenGL. The back end (optimization and code gen) is usually common, with an IR in between. For example, Catalyst uses AMDIL as the common IR, while the open source drivers use TGSI.



      ??? The GPU's hardware doesn't know or care what API is being used. The driver translates API-specific operations into a series of hardware commands, and the compiler translates API-specific shader programs into shader hardware ISA. It's been 100% common from the first programmable GPU as far as I know.


      We are saying exactly the same thing. Regardless of whether the compiler is for D3D or OpenGL, your driver/rasterizer must be unified and use both compilers. So why do we need D3D-bytecode-to-OpenGL translation with your driver? You just rephrase my words while we say the same thing, and then you reach a different conclusion to justify yourself. I didn't write that compilers are part of the synthesizer; I wrote just the opposite.
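
      To make the split in the quote concrete: the shader compiler only runs when the application asks for a compile, while the draw path just builds the command stream. Below is a tiny self-contained C sketch of that structure; every identifier in it is invented for illustration, nothing is taken from Catalyst, Mesa, or Wine:

      Code:
      /* Toy "driver" showing where the shader compiler sits relative to drawing. */
      #include <stdio.h>

      typedef struct { const char *isa; } gpu_shader;          /* stand-in for compiled ISA */
      typedef struct { gpu_shader *vs, *ps; } gpu_context;

      /* Compile path: a front end parses GLSL (or HLSL in a D3D driver), a shared
       * back end lowers an IR (think AMDIL or TGSI) to hardware ISA.  This runs
       * only when the app issues a compile call such as glCompileShader. */
      static gpu_shader *drv_compile_shader(const char *source)
      {
          static gpu_shader compiled = { "fake GPU ISA" };
          printf("compile: %s\n", source);                      /* frontend -> IR -> codegen */
          return &compiled;
      }

      /* Draw path: no compiler involved, just state changes and a draw packet. */
      static void drv_draw_arrays(gpu_context *ctx, int first, int count)
      {
          printf("emit state changes\n");                       /* blend, depth, ... */
          printf("bind shaders: %s / %s\n", ctx->vs->isa, ctx->ps->isa);
          printf("emit draw: first=%d count=%d\n", first, count);
      }

      int main(void)
      {
          gpu_context ctx;
          ctx.vs = drv_compile_shader("void main() { gl_Position = vec4(0.0); }");
          ctx.ps = drv_compile_shader("void main() { gl_FragColor = vec4(1.0); }");
          drv_draw_arrays(&ctx, 0, 3);                          /* compiler never touched here */
          return 0;
      }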



      • #33
        Originally posted by bridgman View Post
        The compiler is one part of the driver. The compilers translate shader source in GLSL or HLSL to GPU hardware ISA, but the driver handles the rest of the work, i.e. the stream of state-change and drawing commands, getting work to the hardware and getting results back. The compiler is a separate piece of code called by the driver only when it encounters an API command to compile a shader.



        Actually no -- the compiler is not involved with actual drawing operations. The compiler generates code which is run on the shader cores, and that code in turn runs when the driver tells the GPU hardware to (for example) run the vertex shader on each element in an array of vertices, reassemble the vertices into triangles, and then generate pixels from the triangles and run the pixel shader on each pixel in a triangle.



        The front end of shader compilers is pretty much always totally different between D3D and OpenGL. The back end (optimization and code gen) is usually common, with an IR in between. For example, Catalyst uses AMDIL as the common IR, while the open source drivers use TGSI.



        ??? The GPU's hardware doesn't know or care what API is being used. The driver translates API-specific operations into a series of hardware commands, and the compiler translates API-specific shader programs into shader hardware ISA. It's been 100% common from the first programmable GPU as far as I know.


        And when I say "rasterizer" I mean the driver's graphics synthesizer, not the hardware (if you know the definition). So just tweak the Linux OpenGL rasterizer to run HLSL compilations directly, or at least to run the outcome of the translations to OpenGL efficiently (a toy illustration of that translation is below).
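
        To give a feel for what "the outcome of the translations to OGL" means, here is a toy C program that maps a couple of D3D9 pixel-shader assembly instructions to GLSL lines and prints the result. The mapping table and output are invented and vastly simpler than anything Wine's wined3d actually emits; it only shows that translated GLSL, not HLSL, is what the GL driver ends up compiling:

        Code:
        /* d3d2glsl.c - toy D3D9 shader assembly to GLSL "translator" (illustration only). */
        #include <stdio.h>
        #include <string.h>

        struct rule { const char *d3d; const char *glsl; };

        static const struct rule table[] = {
            { "def c0, 1, 0, 0, 1", "const vec4 c0 = vec4(1.0, 0.0, 0.0, 1.0);" },
            { "mov oC0, c0",        "gl_FragColor = c0;" },
        };

        int main(void)
        {
            /* A trivial D3D9 pixel shader in assembly form: output solid red. */
            const char *ps[] = { "ps_2_0", "def c0, 1, 0, 0, 1", "mov oC0, c0" };

            printf("void main()\n{\n");
            for (size_t i = 0; i < sizeof(ps) / sizeof(ps[0]); i++)
                for (size_t j = 0; j < sizeof(table) / sizeof(table[0]); j++)
                    if (strcmp(ps[i], table[j].d3d) == 0)
                        printf("    %s\n", table[j].glsl);
            printf("}\n");
            return 0;
        }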



        • #34
          Originally posted by Artemis3 View Post
          I play Skyrim with a GTX460 for hours on end (until I fall asleep) just fine. What you describe might happen if you forget to modify the Skyrim binary to become "Large Address Aware", which you only need to do once in older versions. This patch has been official since at least Dec 2011, so users on the Steam version would not notice.

          It is a long-known fact: ATI = problems for Wine. You are very lucky to have games work out of the box, while with Nvidia that is rather common. The ATI experience on Linux is very poor, and I see Intel far more committed to the community, so the support should go there instead. I would never buy a system with ATI, but an Ivy Bridge system next year should probably work well.

          It's been many years and AMD has failed to deliver solid GPU support (even Windows users complain); that is a fact. Linux is rather low on their list of priorities, next to "don't care". You are very lucky to find a select model with a select Catalyst version and select versions of Xorg, etc. to make ATI work with select games, instead of Nvidia's "most things just work out of the box" experience.
          Yeah, YMMV of course. If Intel GPUs start becoming powerful enough and cheap enough, that would really interest me. I agree; I have been an ATI (AMD) user on Windows and yeah, there were some driver difficulties, mostly sub-par performance. Just so we are clear, I am not trying to play favorites; I am just sick of people bashing AMD without the good being represented.

          I can tell you something bad about AMD if you wish: I had to upgrade to a beta driver to play TF2. That kinda pissed me off, as did the dropping frame rates in the last few drivers. Overall, though, I like my AMD card better and will purchase from them again.

          Versions: xorg-server 1.13.2-1, catalyst 13.1-1 (patched).



          • #35
            Originally posted by totex71 View Post
            He is not lucky. The Catalyst drivers AMD provides aren't as bad as people describe. I have been using AMD on Linux for longer than I have used Nvidia, and the performance is almost comparable now; it wasn't some years ago. But you have to tweak your xorg.conf on AMD to get it working like it would with Nvidia. Skyrim works great for me too with Wine, and so do Crysis and a lot of other graphics-heavy games.

            For me, TERA Online (or any other Unreal Engine 3 game) doesn't start on any Radeon 2000-6000, while it plays fine on a GeForce 200-600. Also, RIFT only works with the low-quality renderer at 15 fps on Radeons, while on GeForces it gets 30+ fps with the high-quality renderer and medium settings (not everything works).



            • #36
              Originally posted by TFA
              - Apple's OpenGL stack is (surprisingly) a mess.
              Why be surprised? It's been of worse quality than Mesa for several years now.

              (Quality = stability and doing what was intended; it may support more extensions or have more speed, but those don't really matter if it crashes or has corrupted rendering, do they?)



              • #37
                Wine tunes its code for Nvidia and detunes it for AMD. It's been this way for years; not at any fault of AMD/ATI, mind you, it's just that the Wine people have an Nvidia hard-on.



                • #38
                  Or rather, the Wine devs tend to rely on edge cases that work on Nvidia and nothing else. It's my opinion that the target should be edge cases that work on the majority, not just the one.

                  EDIT: It is a fact that the OSS drivers are still incomplete... We'll have to see how much the situation improves in the coming years as the OSS drivers continue to develop. Even when the OSS drivers mature, I still expect this to be a problem, though, as Wine relies on non-standards-compliant behavior that only the Nvidia drivers exhibit.

                  I expect that Wine won't change its ways, though, so the OSS drivers will probably eventually have to add something like an "nvidia-quirks" compatibility workaround (see the sketch after this post).
                  Last edited by duby229; 08 February 2013, 06:36 PM.
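
                  Mesa already has a mechanism in this spirit: the drirc/driconf per-application option system. Below is a minimal C sketch of what a per-application quirk table looks like in principle; the quirk names and entries are made up and are not Mesa's actual options:

                  Code:
                  /* Sketch of a per-application quirk lookup; all names are invented. */
                  #include <stdio.h>
                  #include <string.h>
                  #include <stdbool.h>

                  struct quirk {
                      const char *app_name;      /* matched against the process name */
                      bool relax_glsl_errors;    /* accept shaders only Nvidia would accept */
                      bool ignore_bad_viewport;  /* paper over an app-side assumption */
                  };

                  static const struct quirk quirks[] = {
                      { "wine-preloader", true,  false },
                      { "SomeGame-bin",   false, true  },
                  };

                  static struct quirk lookup_quirks(const char *app)
                  {
                      for (size_t i = 0; i < sizeof(quirks) / sizeof(quirks[0]); i++)
                          if (strcmp(quirks[i].app_name, app) == 0)
                              return quirks[i];
                      return (struct quirk){ app, false, false };  /* default: strict GL behavior */
                  }

                  int main(void)
                  {
                      struct quirk q = lookup_quirks("wine-preloader");
                      printf("relax_glsl_errors=%d ignore_bad_viewport=%d\n",
                             q.relax_glsl_errors, q.ignore_bad_viewport);
                      return 0;
                  }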



                  • #39
                    Wine does not give preferential treatment to Nvidia. Wrt those NV extensions in arb_program_shader.c: give me an equivalent assembler-style shader API exposed by AMD drivers that provides features equivalent to Shader Model 3 in Direct3D 9 and I'll see if I can add a backend for it to Wine. (Hint: there is no such API, hence the limit to Shader Model 2.)

                    We do have some AMD-specific codepaths as well: see ati_fragment_shader.c, for example, or GL_ATI_texture_compression_3dc.
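
                    To illustrate why the exposed extensions dictate the backend choice, here is a rough, self-contained C sketch. The GL extension strings are real, but the selection logic and every other identifier are invented for illustration and are not Wine's actual code:

                    Code:
                    /* Toy shader-backend selection based on the GL extension string. */
                    #include <stdio.h>
                    #include <string.h>

                    enum backend { BACKEND_GLSL, BACKEND_ARB_SM3, BACKEND_ARB_SM2, BACKEND_ATIFS };

                    static int has_ext(const char *exts, const char *name)
                    {
                        return strstr(exts, name) != NULL;
                    }

                    static enum backend pick_backend(const char *exts, int prefer_arb)
                    {
                        if (!prefer_arb && has_ext(exts, "GL_ARB_shading_language_100"))
                            return BACKEND_GLSL;            /* GLSL can express SM3-level features */
                        if (has_ext(exts, "GL_ARB_fragment_program")) {
                            /* The NV assembly extensions add the dynamic branching SM3 needs;
                             * plain ARB programs roughly cap out at SM2 functionality. */
                            if (has_ext(exts, "GL_NV_fragment_program2") &&
                                has_ext(exts, "GL_NV_vertex_program3"))
                                return BACKEND_ARB_SM3;
                            return BACKEND_ARB_SM2;
                        }
                        return BACKEND_ATIFS;               /* GL_ATI_fragment_shader fallback */
                    }

                    int main(void)
                    {
                        const char *amd = "GL_ARB_shading_language_100 GL_ARB_fragment_program";
                        printf("backend id: %d\n", pick_backend(amd, 1)); /* ARB path, SM2 level */
                        return 0;
                    }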

                    Wrt performance: this version of the video of my talk has the slides in it: http://www.youtube.com/watch?v=T4ACXvm2gbc . What I intended to show (I hope I was successful in that) is that both r600g and fglrx run Linux-native OpenGL apps (namely Unigine, UT2004, Xonotic, Lightsmark) a lot slower than the Windows driver, whereas the Linux Nvidia driver outperforms its Windows counterpart. Not a single line of Wine code is involved. Wine cannot and does not magically fix that.

                    A word on r600g: it is a pretty impressive piece of work, considering that it is mainly developed by volunteers. I did not intend to talk badly about it, but I also did not intend to sugarcoat its shortcomings. My main point wrt the two AMD drivers is that it's pointless to compare r600g to fglrx. You want to compare r600g to the Windows driver, which is the real benchmark here. (At least on this Evergreen card; things might be different on SI.)



                    • #40
                      Over my years of using Linux on a daily basis, nVidia has supported their cards the best overall compared to AMD and Intel. People can say that their AMD graphics cards give them no problems, but they do not list the model of their card. The Catalyst drivers work best on the latest graphics models; as the Catalyst version increases, support for older cards and the quality of the drivers for older hardware get worse. You can argue all you want that I am wrong, but I am not wrong. AMD does not care for their low-end graphics as much as they do for their high-end graphics. nVidia supports their low end and high end to the same degree.

                      Intel graphics stability and reliability are poor. Sure, you can hope the Intel graphics drivers get better, but I would not count on it. Occasionally, the Intel graphics driver glitches, and these glitches produce the following error.

                      Code:
                      [drm:i915_hangcheck_hung] *ERROR* Hangcheck timer elapsed... GPU hung
                      In this case, Intel graphics sucks for production.

                      People here need to accept that nVidia graphics is required to game on Linux. Yes, you can use AMD or Intel graphics, but the poor reliability, poor stability, and poor OpenGL support will be your nemesis. It is best to have both open source and closed source drivers available instead of being restricted to open source only; using only open source is very limiting.

                      If I were a developer writing a program that draws with OpenGL, I would use an nVidia card as my development hardware, and WINE developers probably do the same thing. At compile time, WINE does have to build against the X11 libraries and OpenGL headers, which on Linux typically come from Mesa.

                      Please do not remind me about ATI graphics on Windows. It was a nightmare to use and I prefer not to remember it. I wish I had picked nVidia back in those days. ATI gave me the runaround when trying to help me, and I always had the same problems. AMD graphics is no different: it requires you to keep upgrading to the next high-end model to use the latest Catalyst drivers and get the best support.

                      Back to WINE:
                      WINE developers need to find the areas of DirectX support that still have holes in them. The Sims 2 still does not work, or cannot draw the characters, because a particular instruction is not supported; that really needs to be addressed. WINE lacks support for parts of multiple DirectX versions, so just saying that WINE lacks DirectX 10 support makes it sound as if only that version is incomplete. That is wrong. WINE lacks support across multiple DirectX versions. WINE developers need to write test scripts that exercise every instruction each DirectX version supports (a minimal starting point is sketched below).
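
                      As a trivial example of where such per-version testing starts, here is a minimal Direct3D 9 capability probe in C (Windows-side code built against d3d9.lib; it only queries the advertised shader model, it does not exercise individual instructions):

                      Code:
                      /* d3d9caps.c - report the shader model a D3D9 device claims to support. */
                      #include <stdio.h>
                      #include <d3d9.h>

                      int main(void)
                      {
                          IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
                          if (!d3d) { fprintf(stderr, "Direct3DCreate9 failed\n"); return 1; }

                          D3DCAPS9 caps;
                          if (FAILED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                                              D3DDEVTYPE_HAL, &caps))) {
                              fprintf(stderr, "GetDeviceCaps failed\n");
                              IDirect3D9_Release(d3d);
                              return 1;
                          }

                          printf("vertex shader %lu.%lu, pixel shader %lu.%lu\n",
                                 D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                                 D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
                                 D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                                 D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

                          if (caps.PixelShaderVersion < D3DPS_VERSION(3, 0))
                              printf("Shader Model 3 pixel shaders are not available\n");

                          IDirect3D9_Release(d3d);
                          return 0;
                      }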
