Gaming On Wine: The Good & Bad Graphics Drivers


  • #46
    Guess what my posts here are intended to achieve :-). We do get help from outside individuals and companies. The problem with games is that there are just so many games and graphics cards that it is impossible to test, fix, and QA all of them. Interest from other companies is mostly focused on productivity applications. This is why we need lots of help from volunteers.

    At CodeWeavers we have some statistics on which games our users run. World of Warcraft leads the pack, at less than 1% of the total share. The distribution is fairly flat: every customer wants a different game.

    Wrt GLSL vs assembler: GLSL is not the problem. I prefer an excellent GLSL compiler over vendor-specific assembler extensions. But we really need an excellent compiler that goes 100% of the way, not a mediocre one that gets 80% of the use cases right. This 80% vs 100% consideration applies to all other areas of OpenGL, and is the main difference between the Nvidia driver and all others.

    We have very limited flexibility in avoiding corner cases. If a game hits a d3d corner case, it will hit the same corner case in OpenGL. E.g. Diablo III uses a depth texture as a texture and as the depth buffer simultaneously. The depth test is on, but depth write is off. This is legal in both d3d and gl. Nvidia and r600g get this right. Fglrx does not. We cannot work around this bug. Yes, we could in theory create a new texture, but this makes the code messy, fixes one game and breaks 5 others. Believe me, we tried.

    Likewise, if a GPU supports 256 vertex shader constants in hardware, the game requires 254 for one of its shaders, and the driver consumes 4 for its private use, then this won't work: 254 + 4 > 256. This affects many drivers for dx9 cards, and is a real pain on OS X. r300g is slightly better here, but only Nvidia gives us all 256 constants it advertises.
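The constant-budget failure described above can be sketched with the numbers from the post (the budget check itself is a hypothetical illustration, not how any particular driver reports the error):

```shell
# Sketch of the vertex-constant budget from the post.
hw_constants=256     # constants the hardware (and driver) advertises
driver_reserved=4    # silently consumed by the driver for internal use
game_needs=254       # required by one of the game's d3d9 shaders

available=$((hw_constants - driver_reserved))
if [ "$game_needs" -gt "$available" ]; then
    echo "shader rejected: needs $game_needs constants, only $available usable"
else
    echo "shader fits"
fi
```

The point is that the driver's private consumption is invisible to the application, so a shader that is perfectly legal against the advertised limit can still fail.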

    Comment


    • #47
      Originally posted by stefandoesinger View Post
      We have very limited flexibility in avoiding corner cases. If a game hits a d3d corner case, it will hit the same corner case in OpenGL. E.g. Diablo III uses a depth texture as a texture and as the depth buffer simultaneously. The depth test is on, but depth write is off. This is legal in both d3d and gl. Nvidia and r600g get this right. Fglrx does not. We cannot work around this bug. Yes, we could in theory create a new texture, but this makes the code messy, fixes one game and breaks 5 others. Believe me, we tried.
      Do you guys at CodeWeavers have direct contact with AMD's fglrx team for bug reports?

      Comment


      • #48
        Originally posted by PsynoKhi0 View Post
        Do you guys at CodeWeavers have direct contact with AMD's fglrx team for bug reports?
        Yes. Usually we file bugs in their unofficial Bugzilla and nudge them about it. Personally I prefer to put the effort into fixing r600g though.

        The bug I was talking about here was reported as http://ati.cchtml.com/show_bug.cgi?id=426. I don't know if Matteo has made any further effort to get the bug fixed, but I assume he has. We also have access to their beta drivers.

        With both Nvidia and AMD it takes a bit of luck to get bugs fixed in time. I guess it depends on their internal workload. Apple is really bad here, as I've explained at FOSDEM.

        Comment


        • #49
          Originally posted by stefandoesinger View Post
          We have very limited flexibility in avoiding corner cases. If a game hits a d3d corner case, it will hit the same corner case in OpenGL. E.g. Diablo III uses a depth texture as a texture and as the depth buffer simultaneously. The depth test is on, but depth write is off. This is legal in both d3d and gl. Nvidia and r600g get this right. Fglrx does not. We cannot work around this bug. Yes, we could in theory create a new texture, but this makes the code messy, fixes one game and breaks 5 others. Believe me, we tried.
          Having just hit that exact thing myself two weeks ago: it's undefined according to the GL standard, and it causes the loss of all early-Z/Hi-Z optimizations on my card (hd4k). Changing my code not to do that gave around a 1000x speedup, all on r600g.

          I agree it's not Wine's place to do hacks like that, but this one is really not valid GL. Blame Blizzard (or not, since it is legal in DX).

          Comment


          • #50
            Originally posted by stefandoesinger View Post
            We don't need (well-intentioned) directions, we need manpower. We are 4 people working on the d3d code. All of us are paid by CodeWeavers; without that, our work wouldn't be possible. We do have some other responsibilities as well (e.g. school in my case), so we're only working part time on the d3d code.

            What you can do to help:
            • If you know your way around C and OpenGL and have a game that doesn't work, try to fix it.

              This can be a tricky task, but we can help you and give you hints. Contact wine-devel@winehq.org if there are any issues.
            • If you're not a developer, but don't mind compiling Wine from git, run your games against the git code, then bisect and report any regressions you find.
            • If you're doing the above on top of the open source drivers, use Mesa git as well.
            • We need QA help on OS X.
            • I have an automated performance monitoring setup, but it needs to run on many more systems. If you're willing to help here, please get in contact with us.

            One annoying factoid is that I spend about one fifth of my time maintaining ~12 operating system installations (Windows, Linux, OS X) on five different computers, just to have different GPUs and drivers for testing.
            I'll take you up on the first 3 :-) Or I'll try to. Lots of learning needed on my side.

            Comment


            • #51
              Originally posted by haplo602 View Post
              I'll take you up on the first 3 :-) Or I'll try to. Lots of learning needed on my side.
              Woo

              It's a lot of learning, but it is very interesting. I once started because I wanted to play Empire Earth

              Comment


              • #52
                Originally posted by stefandoesinger View Post
                Woo

                It's a lot of learning, but it is very interesting. I once started because I wanted to play Empire Earth
                Same with me ... I am an avid gamer (mostly MMORPGs) with a simple rule: what doesn't work in Wine, I don't play :-)))

                Comment


                • #53
                  Originally posted by stefandoesinger View Post
                  At CodeWeavers we have some statistics which games our users run. World of Warcraft leads the pack. At less than 1% of total share. The entire thing is a fairly flat distribution. Every customer wants a different game.
                  While I can't argue with your statistics, I can say that WoW runs very well on vanilla Wine, so I wonder how many people running WoW on vanilla Wine simply aren't counted. I do, however, encourage people asking for game/app X to get fixed to buy CrossOver and make their voice known.

                  Comment
