Wine's Big Command Stream D3D Patch-Set Updated


  • #31
    Originally posted by justmy2cents View Post
Did more testing and I'm yet to see any of my games crash. Although, once I saw the GPU bottleneck comment, I went and disabled double buffering in TR2013:

    wine-1.7.10 => min 30.9 fps, with double buffering min 25.9
    command stream version => min 48.9/max 72.3, with double buffering 30.1/60

Looking at the FPS reported when running on Windows with the same GPU as mine, it's about 80-90% there. That's a freaking awesome change.

Now, a question from a complete n00b in the DirectX department: can this be reused in DX10/11, and how different are those two?
I don't know about the architecture of D3D, but as for Wine's, some parts of wined3d are version independent, and as far as I saw these changes were made in its core and not in the d3d9-specific parts: so yes.



    • #32
      Originally posted by TemplarGR View Post
You state the obvious here. Of course the problem with Wine is CPU performance: it is an emulation layer that translates D3D to OpenGL, so it will introduce severe CPU overhead... You didn't need tests to figure that out...
Always test your assumptions. Never make decisions based on gut feelings and hearsay you cannot verify. At the very least, a benchmark will serve as a backup if you get into a discussion with a jerk like me who wants to make sure you know what you're doing.
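
As an illustration of what testing your assumptions can look like in practice, here is a minimal sketch for comparing two Wine builds on the same in-game benchmark. Everything in it is hypothetical: the build paths, prefixes and benchmark command are placeholders, and it simply asks you to type in the minimum FPS you read off each result screen.

```python
#!/usr/bin/env python3
# Hypothetical benchmark harness: run the same in-game benchmark under two
# Wine builds several times, record the FPS you read off the result screen,
# and print the median per build. Paths and commands are placeholders.
import os
import statistics
import subprocess

BUILDS = {
    "wine-1.7.10":      {"wine": "/opt/wine-1.7.10/bin/wine", "prefix": "~/.wine-vanilla"},
    "wine-1.7.10-csmt": {"wine": "/opt/wine-csmt/bin/wine",   "prefix": "~/.wine-csmt"},
}
BENCHMARK_CMD = ["C:\\Games\\TR2013\\TombRaider.exe", "-benchmark"]  # placeholder
RUNS = 3  # repeat runs to see run-to-run variance

results = {}
for name, build in BUILDS.items():
    samples = []
    for i in range(RUNS):
        env = {**os.environ, "WINEPREFIX": os.path.expanduser(build["prefix"])}
        subprocess.run([build["wine"], *BENCHMARK_CMD], env=env, check=False)
        samples.append(float(input(f"{name}, run {i + 1}: min FPS reported? ")))
    results[name] = samples

for name, samples in results.items():
    print(f"{name}: median min FPS {statistics.median(samples):.1f} (runs: {samples})")
```

The point is not the script itself but the habit: same machine, same settings, several runs per configuration, and numbers you can show to the next jerk who asks.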

      Originally posted by TemplarGR View Post
The problem with modern CPUs is that they cannot improve per-core performance fast enough to keep up with GPU performance. So anything that reduces CPU strain will be vastly important from now on.
This is not true as a general statement. Believe me, I've run plenty of benchmarks, and even the current single-threaded wined3d manages to exhaust some GPUs. It really depends on the application, the strength of the GPU and CPU, and the application settings, especially the resolution and antialiasing. Examples here are the fairly weak GeForce 9600 and GTX 650M GPUs in my MacBooks. At HD resolutions with MSAA disabled the GPUs can't keep up, even with current Wine. Some game examples: StarCraft 2 at lowest settings, all Source engine games, even ancient 3DMark2001.

But overall I agree, and that's why I have written the command stream patches. Our problems are more on the CPU side than the GPU side. On the GPU side we reach pretty much native speed on Nvidia cards with the proprietary driver; r600g doesn't get anywhere close, and neither do fglrx or OS X. See my FOSDEM presentation from last year.
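
For readers wondering what "command stream" means here in practice: the rough idea is that the application's D3D calls are recorded into a queue by the calling thread and replayed against OpenGL by a dedicated worker thread, so the game's own thread spends less time blocked in the GL driver. Below is a toy Python sketch of that producer/consumer pattern, purely for illustration; it is not Wine's wined3d code, and all the command names are made up.

```python
# Toy sketch of the command-stream idea (producer/consumer), not Wine's code.
# The "application thread" only records commands; a worker thread replays them
# against the (slow) graphics API. All command names here are invented.
import queue
import threading

class CommandStream:
    def __init__(self):
        self._commands = queue.Queue()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def submit(self, name, *args):
        """Called on the application (D3D) thread; returns almost immediately."""
        self._commands.put((name, args))

    def finish(self):
        """Block until everything queued so far has been executed
        (roughly what a Present/flush boundary would do)."""
        self._commands.join()

    def _run(self):
        """Worker thread: drain the queue and issue the real (GL) calls."""
        while True:
            name, args = self._commands.get()
            try:
                if name == "stop":
                    return
                print(f"worker executing {name}{args}")  # stand-in for GL calls
            finally:
                self._commands.task_done()

# Usage: the game thread keeps queuing work instead of waiting on the driver.
cs = CommandStream()
cs.submit("clear", (0.0, 0.0, 0.0, 1.0))
cs.submit("draw_primitives", "triangles", 500)
cs.submit("present")
cs.finish()      # frame boundary: wait for the worker to catch up
cs.submit("stop")
```

The real patch set of course has to deal with synchronisation and the cases where the application needs results back, which a toy like this ignores.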

      Originally posted by TemplarGR View Post
      That is why AMD introduced MANTLE afterall.
I don't know. All I've heard so far about MANTLE is hype. I have not seen any API documentation and have not been able to run any benchmarks myself. All I can think of right now is that AMD is trying to establish its own proprietary API to get some vendor lock-in. As a game developer, I would be very careful before using it.

      Originally posted by TemplarGR View Post
      So, Wine could use anything that can improve its performance. A D3D state tracker can eliminate this overhead altogether if properly coded. Although it is not multiplatform, it could provide a big boost for gallium users. Especially with modern games. Imagine games like Rome 2 total war. I am willing to bet a huge sum of money that Wine will face a tremendous challenge in trying to match its Windows performance...
      You'll have to specify your bet in more detail. There are dozens of factors that affect performance. E.g. my command stream work beats Windows in 3dmark2001 on nvidia. On other OSes / drivers there are driver problems. In other games, Wine has problems outside the d3d code. In some cases (e.g. good old Empire Earth 1) there are still problems in our d3d code that won't go away anytime soon.

      A statement like "Wine can match Windows performance" will have to be evaluated on a game by game, OS by OS, driver by driver basis for a LONG time to come. Maybe forever.

My main point wrt a d3d9 Mesa state tracker is not that it won't improve performance, but that it is, at the moment, an inefficient use of our time. The command stream, which works everywhere, has a much bigger impact. And no, we won't just put a random piece of code into our codebase. If we do that, we have to support and maintain it. Again, an inefficient use of our time.



      • #33
        Originally posted by justmy2cents View Post
Now, a question from a complete n00b in the DirectX department: can this be reused in DX10/11, and how different are those two?
        Yes, it can be reused. In fact, a big consideration in my work to merge this upstream is how this will interact with wined3d changes needed to support d3d10/11.



        • #34
          Originally posted by stefandoesinger View Post
          Yes, it can be reused. In fact, a big consideration in my work to merge this upstream is how this will interact with wined3d changes needed to support d3d10/11.
That's nice to hear. Keep up the good work!



          • #35
Excuse me for the stupid questions: does this patch-set only have an effect with open-source drivers? And are this GitHub Wine version and CrossOver 13 from CodeWeavers identical?



            • #36
              Originally posted by OnioWoess View Post
Excuse me for the stupid questions: does this patch-set only have an effect with open-source drivers? And are this GitHub Wine version and CrossOver 13 from CodeWeavers identical?
I'd guess it doesn't work only with open-source drivers. I'm using the tarball from this GitHub on the NVIDIA blob, and the second version I used was the 1.7.10 tarball from the Wine site; as you may have noticed, performance was quite different for me. The only patch I applied to both was the one to get TR2013 working, where you remove the optimization for CreateEventExW.

No clue about the second question.



              • #37
Well, I dunno what I'm doing wrong, but I see almost no difference yet between standard Wine (1.6) and this one (1.7.10 patched), both clean installs on Linux Mint XFCE Petra. Also, taking these games running under Wine and under WinXP as examples: Xonotic, Killing Floor and Half-Life 2, I still see a huge difference in their performance and lose roughly half the FPS under Wine, plus freezes. Added the HKCU/Software/Wine/Direct3D/SCMT="enabled" key - it changes nothing. Tested on my old PC with an NVIDIA GTS 450 1GB on the 319 drivers, AMD X2 4400+ at 1750 MHz.
                Last edited by OnioWoess; 01-12-2014, 03:27 PM.



                • #38
D3D9 Mesa + Wine

I'm working on porting the D3D9 state tracker to Mesa 10.0. Then I'll provide patchsets for Mesa and Wine and create Gentoo ebuilds. Anyway, it needs more developers, because without them it'll be unusable soon.



                  • #39
                    Originally posted by OnioWoess View Post
Added the HKCU/Software/Wine/Direct3D/SCMT="enabled" key - it changes nothing.
                    Probably because you misspelled it. It should be CSMT, not SCMT.
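
For anyone else tripping over this: a minimal sketch of setting the switch from a script, assuming, as described in this thread, that it is the string value "CSMT"="enabled" under HKEY_CURRENT_USER\Software\Wine\Direct3D. The prefix path is a placeholder; adjust it to the prefix you actually run the game from.

```python
#!/usr/bin/env python3
# Minimal sketch: write a .reg file with the CSMT switch described in this
# thread ("CSMT"="enabled" under HKCU\Software\Wine\Direct3D) and import it
# into a Wine prefix with `wine regedit`. The prefix path is a placeholder.
import os
import subprocess
import tempfile

REG_CONTENT = (
    "REGEDIT4\n"
    "\n"
    "[HKEY_CURRENT_USER\\Software\\Wine\\Direct3D]\n"
    "\"CSMT\"=\"enabled\"\n"
)

def enable_csmt(wineprefix="~/.wine"):
    env = {**os.environ, "WINEPREFIX": os.path.expanduser(wineprefix)}
    with tempfile.NamedTemporaryFile("w", suffix=".reg", delete=False) as f:
        f.write(REG_CONTENT)
        reg_path = f.name
    try:
        subprocess.run(["wine", "regedit", reg_path], env=env, check=True)
    finally:
        os.unlink(reg_path)

if __name__ == "__main__":
    enable_csmt()
```

Or just run wine regedit in the right prefix and add the value by hand; the only things that matter are the exact key path, the value name CSMT and the string data "enabled".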



                    • #40
Got a freeze in EVE Online. Was running only one client.

Two cores were stuck at 100% usage for exefile.exe (which is the EVE client).
Might be an fglrx-related bug or something else, as I'm mining on the same computer.



                      • #41
                        Originally posted by stefandoesinger View Post
                        Well, our (the Wine developers') and the Mesa developers' time is limited. With regards to running d3d9 applications on Mesa we have the following options:
                        1. Make Wine work better on all OpenGL implementations.
                        2. Make Mesa run all OpenGL applications better.
                        3. Write and maintain lots of special code to make Wine run better on Mesa.

                        Considering finite resources, we believe 1 is the way to go, and we're helping the Mesa devs with 2. You may disagree and submit code to either project to implement 3. But don't think it's a conspiracy when we disagree with you about what to do with our time.


I have Mesa and Wine compiled by myself with the D3D9 state tracker. I just don't know why all the rest don't, and they need to do the same as me. I didn't ask you why you don't work on improving the state tracker. We have seen what a single person can do if he is technologically and ethically right (a state tracker), so I don't really want to ask anything about your time. There are distros and individuals responsible for having those packages in their repositories.



                        • #42
                          Originally posted by stefandoesinger View Post
The easy work is done. What remains is the hard part that separates a proof of concept from code that is stable and maintainable in the long term. In other words, following the 80/20 rule, 80% of the work still needs to be done.

                          A good start would be to quantify the performance difference between wined3d and gallium-nine with reproducible benchmarks and then isolate where the performance difference is coming from. And that means not just "it reduces overhead, so it's faster", but something like "There's CPU-side overhead in module X, and GPU-side overhead because shader instructions A, B and C are inefficiently handled by module Y".

If it turns out that there's a fundamental advantage to a gallium state tracker, and that it's not just working around some bugs in Mesa and Wine that could be fixed with e.g. a better GLSL compiler or one or two focused GL extensions to support some d3d-isms better, the next task is finding a stable API exposed by gallium-nine and used by Wine.

                          Matteo has done some testing with gallium-nine on r600g. If I remember correctly, he saw a moderate performance gain (~15% or something), but attributed most of that to insufficient optimization in the driver's GLSL compiler. I'll ask him to make sure my memory serves me right.

That's weird, I mostly get a 2x gain with an i5 and 3x+ with a Core 2. This thing offloads the CPU dramatically, and some advanced shaders that don't even run with Wine now work. It's like GLSL=disabled plus threaded optimizations plus more, all together.



                          • #43
Gallium Nine is still ALIVE!

I wrote a small article about it; it works with Mesa 10.0 and the latest Wine. DEVELOPERS NEEDED! So, if you want to help with some small patch, some improvement, a fix, everything is welcome!

I'll try to keep it alive, any help appreciated!

                            https://ixit.cz/faster-wine-games-wi...-gallium-nine/



                            • #44
                              Originally posted by stefandoesinger View Post
                              Probably because you misspelled it. It should be CSMT, not SCMT.
Yup, CSMT, certainly; it's misspelled only here in my post. Maybe I should have put that <enabled> in quotes? ))

                              Originally posted by artivision View Post
That's weird, I mostly get a 2x gain with an i5 and 3x+ with a Core 2. This thing offloads the CPU dramatically, and some advanced shaders that don't even run with Wine now work. It's like GLSL=disabled plus threaded optimizations plus more, all together.
Please, please, can you tell me how? I cannot manage that.



                              • #45
                                Originally posted by artivision View Post
That's weird, I mostly get a 2x gain with an i5 and 3x+ with a Core 2. This thing offloads the CPU dramatically, and some advanced shaders that don't even run with Wine now work. It's like GLSL=disabled plus threaded optimizations plus more, all together.
This is my opinion as well, and I don't need tests to be convinced...

A D3D state tracker can make Wine extremely efficient for games. Granted, it is only for Gallium and there may be patent fears, etc. But I don't believe it requires much work.

The way I see things, Wine and CrossOver developers have certain target groups, and Linux users don't rank highly on that list... How many times has it happened that a project which could be used to improve Wine for Linux didn't get accepted?

Wine prefers the Mac platform, it is obvious. I am willing to bet that if Apple had gone the Gallium3D way, Wine would have supported D3D state trackers ASAP.

I didn't want to say this earlier, because I wanted to read Stefan's opinion first. And he just provided us with excuses that have no technical merit at all.

No one asked them to develop or maintain a D3D9 state tracker. We just asked for the option to use it to be included officially... It is not much work, and it could be a compile-time option.

I am pretty sure the Mesa devs wouldn't mind including it if there were a use for it, and the only use for it would be if Wine supported it. So this is a chicken-and-egg problem: Wine devs won't support it because it is not for Apple systems, with excuses like "it needs maintenance", while Mesa devs can't mainline it if there is no software using it...

