Mesa OpenGL Threading Now Ready For Community Testing, Can Bring Big Wins


  • #51
    I think any change that clearly improves framerate is fine, whatever the CPU.

    Whether it's a low-, mid-, or high-end CPU doesn't matter.



    • #52
      Originally posted by dungeon View Post

      It can't be used like that, as no one can know or guess an app's performance without actually running the app.

      So it's easy to write an "AI" if you have an algorithm to guess that, but guessing performance without running the app is more or less impossible.

      This threading could be enabled by default, but it is disabled because testing has shown it degrades other things. It's the same on any GL driver, including NVIDIA's; you could say it will be enabled once nobody cares anymore about what it degrades.
      It can also be enabled/disabled per case rather than per game, depending on which threads the game asks for and which processor you have. Also, we need a control panel.



      • #53
        Those who can't edit an XML file should write a control panel to edit it.

        Intel and AMD have released hundreds of CPUs in the last decade or so, and you want to write an AI so that any of them runs optimally on any of thousands of games, even when some CPUs simply aren't fast enough for some games?
        Last edited by dungeon; 10 July 2017, 07:37 AM.



        • #54
          Quick question: does this OpenGL threading work on r600? I mean on an HD 5850 or HD 5650M? Should I try testing it? marek
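          For anyone who wants to try it on a single run without editing any XML: Mesa also reads this option from the environment, so a per-command variable should work. The inner echo below is just a stand-in to show the variable reaching the child process; you'd replace it with the actual game binary (the `./SomeGame.bin` name is a made-up placeholder).

```shell
# mesa_glthread is read by Mesa at context creation; setting it
# per-command scopes the override to that one process.
mesa_glthread=true sh -c 'echo "mesa_glthread=$mesa_glthread"'
# prints: mesa_glthread=true
# For a real test, something like: mesa_glthread=true ./SomeGame.bin
```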



          • #55
            Originally posted by shmerl View Post

            Yes, it does, but it can also collide with CSMT, which is a related idea. They can step on each other and make things worse. So test for yourself, each case can be different.
            This is confusing as hell. If I understand it right, it collides with Wine's D3D CSMT implementation that is already included in wine-staging, but what about Gallium Nine's internal multithreading (csmt_force=1, which should be enabled by default since Mesa 13.2 or 17.0)? If I understand it right, in the Gallium Nine case this option has no influence and does not collide?

            Also, on topic, would this configuration work?

            Code:
            <application name="Default">
               <option name="mesa_glthread" value="true"/>
            </application>
            And then blacklist specific games individually on top of that?
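            To sketch what that per-game blacklisting might look like (a guess, not a tested config: the <driconf>/<device> wrapper is what a full drirc file uses, and the game name and executable below are made-up placeholders):

```xml
<driconf>
   <device>
      <!-- enable glthread everywhere by default -->
      <application name="Default">
         <option name="mesa_glthread" value="true"/>
      </application>
      <!-- hypothetical per-game override: disable glthread for one executable -->
      <application name="Some Game" executable="somegame.bin">
         <option name="mesa_glthread" value="false"/>
      </application>
   </device>
</driconf>
```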
            Last edited by leipero; 10 July 2017, 08:23 AM.



            • #56
              Originally posted by spstarr View Post
              What this tells me is that OpenGL is crap, but we knew that. The inconsistency it has allowed, letting developers/engines create such hacks and such a mess, is why Vulkan must take over, and soon. I just hope Vulkan forces developers and engine developers into conformance.
              I totally agree with you, but unfortunately my current impression is that isn't happening. There are already boatloads of vendor extensions, and it looks like it's getting worse.



              • #57
                Originally posted by duby229 View Post

                I totally agree with you, but unfortunately my current impression is that isn't happening. There are already boatloads of vendor extensions, and it looks like it's getting worse.
                What makes you think this is a bad thing? Vendors get to test their code as OpenGL extensions and make innovations that got into OpenGL long before DX.



                • #58
                  Originally posted by smitty3268 View Post
                  That's good, because you're wrong. That's not the way this works.
                  And you know this because...?
                  You can't prove a negative. Feel free to confirm for yourself that the hardware doesn't matter that much if you feel you need to.
                  How is it that you still don't understand what I'm talking about? No matter what I say, you seem to think I'm saying something completely different. Where did I say that the hardware doesn't matter? I explicitly said that it DOES matter; that's the sole reason I commented in the first place. Seriously, I feel like you're arguing with me just to argue, because everything you think I'm saying that's wrong or stupid, I also think is wrong or stupid. If you think something is really that dumb, maybe you should take a moment to think it through and give me the benefit of the doubt that I'm not a complete idiot.

                  I get the feeling someone could respond with a test on 20 different systems and you'd just claim that it's 1 game and doesn't prove other games aren't different.
                  Seriously, where are you getting this stuff from? What is your gripe with me? I don't know how to make this any clearer:
                  For a change that affects performance at the hardware level, you can't jump to the conclusion that a game is the problem if you only tested on one piece of hardware. I really don't know how to get that message through to you. If someone were to test 3 of the regressed games on an 8-core CPU, that's all the proof we need. If they still show performance regressions (which they could), then my concern is apparently irrelevant. But who does it hurt to run 3 tests just to prove a point? Meanwhile, if we jump to the conclusion that these games will always regress, regardless of the CPU or GPU being used, that could be hurting performance for someone. So it's worth testing. Jeez, it's not like I'm asking anyone to devote half a day's work...

                  If you can come up with a single test that proves you are correct, please let me know and I'll start taking you seriously.
                  ...that's what I'm asking Marek to do (or anyone really). Just test a couple of the regressed games on a better CPU, but otherwise everything else can remain the same. That's it. Is that really so much to ask? I don't have a Fury X so I can't test it.

                  Maybe you'd take me seriously if you stopped assuming stupid things I never implied.



                  • #59
                    Originally posted by spstarr View Post
                    What this tells me is that OpenGL is crap, but we knew that. The inconsistency it has allowed, letting developers/engines create such hacks and such a mess, is why Vulkan must take over, and soon. I just hope Vulkan forces developers and engine developers into conformance.
                    If your gripe is that OpenGL put enough flexibility in developers' hands to create inconsistencies, how does Vulkan, which puts even more flexibility in those same hands, fix the problem?



                    • #60
                      Originally posted by bug77 View Post

                      If your gripe is that OpenGL put enough flexibility in developers' hands to create inconsistencies, how does Vulkan, which puts even more flexibility in those same hands, fix the problem?
                      But, isn't that exactly what made OpenGL hard to implement and hard to support? What would make you think taking Vulkan down that path would be a good thing?

