
AMD Radeon R9 290 On Ubuntu 14.04 With Catalyst Can Beat Windows 8.1


  • #11
    Originally posted by tmpdir View Post
    I said "not largely shared", "Largely not shared" feels like an completly different statement. The first could mean 65% is shared while the second probably means 35% is shared.

    But I stand corrected; it makes me look at OpenGL test results and changes differently.
    Whoops, you're right. Didn't mean to reverse the words. That said, I don't like either phrase.

    I would translate "not largely shared" to something less than 50% (say 40-45) and "largely not shared" to something much less, maybe 20-25%. AFAIK the code sharing is still over 50%, although certainly lower than it used to be before MS moved the graphics memory manager and related code from the driver to the OS starting with Vista.



    • #12
      I have always wondered why the free drivers seem much closer in performance to the proprietary ones in FOSS games than in commercial ones. It's starting to dawn on me that the proprietary drivers probably have tons of hacks and replacement shaders for every commercial game/benchmark engine to artificially boost their FPS results. That is probably also part of their "secret sauce" that they absolutely cannot give up (i.e. through open sourcing).
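
      To make that concrete, here is a minimal, purely hypothetical sketch in C++ (no real vendor code; the application name, shader hash, and replacement source are all invented) of how a driver could key off the running executable plus a shader hash and silently swap in a hand-tuned shader at compile time:

      // Hypothetical illustration only; no real driver exposes its app-detection logic like this.
      #include <iostream>
      #include <string>
      #include <unordered_map>

      // (app name, shader hash) -> hand-optimized replacement source. All entries are invented.
      static const std::unordered_map<std::string, std::string> kReplacements = {
          {"popular_benchmark.exe:0x9f3a12c4",
           "// hand-scheduled variant tuned for this GPU\nvoid main() { /* ... */ }"},
      };

      std::string compileShader(const std::string& appName,
                                const std::string& shaderHash,
                                const std::string& source) {
          const auto it = kReplacements.find(appName + ":" + shaderHash);
          if (it != kReplacements.end()) {
              // Known title + known shader: compile the vendor-tuned version instead.
              std::cout << "[driver] substituting tuned shader for " << appName << "\n";
              return it->second;
          }
          // Everyone else gets the shader they actually wrote.
          return source;
      }

      int main() {
          const std::string submitted = "void main() { /* what the app wrote */ }";
          compileShader("my_indie_game", "0x9f3a12c4", submitted);         // compiled as-is
          compileShader("popular_benchmark.exe", "0x9f3a12c4", submitted); // silently replaced
      }

      A real driver would presumably match on far more than the executable name (checksums, shader fingerprints, timings), but the effect is the same: only the recognised title gets the tuned path.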



      • #13
        Originally posted by Ancurio View Post
        I have always wondered why the free drivers seem much closer in performance to the proprietary ones in FOSS games than in commercial ones. It's starting to dawn on me that the proprietary drivers probably have tons of hacks and replacement shaders for every commercial game/benchmark engine to artificially boost their FPS results. That is probably also part of their "secret sauce" that they absolutely cannot give up (i.e. through open sourcing).
        Sir, you forgot your tinfoil hat.



        • #14
          Originally posted by grndzro View Post
          Sir, you forgot your tinfoil hat.
          Originally posted by http://richg42.blogspot.de/2014/05/the-truth-on-opengl-driver-quality.html
          Historically, this vendor (Nvidia) will do things like internally replace entire shaders for key titles to make them perform better (sometimes much better). Most drivers probably do stuff like this occasionally, but this vendor will stop at nothing for performance. What does this mean to the PC game industry or graphics devs? It means you, as "Joe Graphics Developer", have little chance of achieving the same technical feats in your title (even if you use the exact same algorithms!) because you don't have an embedded vendor driver engineer working specifically on your title making sure the driver does exactly the right thing (using low-level optimized shaders) when your specific game or engine is running. It also means that, historically, some of the PC graphics legends you know about aren't quite as smart or capable as history paints them to be, because they had a lot of help.
          This should be common knowledge, actually.



          • #15
            Originally posted by Ancurio View Post
            I have always wondered why the free drivers seem much closer in performance to the proprietary ones in FOSS games than in commercial ones. It's starting to dawn on me that the proprietary drivers probably have tons of hacks and replacement shaders for every commercial game/benchmark engine to artificially boost their FPS results. That is probably also part of their "secret sauce" that they absolutely cannot give up (i.e. through open sourcing).
            Yep, you are 100% right: those blob guys always play outside the spec, but that is business.

            Mantle? You can have it. Where? You know. But why? That is business.
            Last edited by dungeon; 18 May 2014, 09:01 PM.



            • #16
              Also read the history of Mesa3D; maybe that is why Mesa traditionally performs well in Quake engine games.

              May 13, 1999

              May 1999 - John Carmack of id Software, Inc. has made a donation of
              US$10,000 to the Mesa project to support its continuing development.
              Mesa is a free implementation of the OpenGL 3D graphics library and id's
              newest game, Quake 3 Arena, will use Mesa as the 3D renderer on Linux.

              The donation will go to Keith Whitwell, who has been optimizing Mesa to
              improve performance on 3d hardware. Thanks to Keith's work, many
              applications using Mesa 3.1 will see a dramatic performance increase
              over Mesa 3.0. The donation will allow Keith to continue working on
              Mesa full time for some time to come.

              For more information about Mesa see www.mesa3d.org. For more
              information about id Software, Inc. see www.idsoftware.com.

              --------------------------------

              This donation from John/id is very generous. Keith and I are very
              grateful.
              And many people know that Mesa is well tested and performs well, but only on those engines; everything else is something of a rough-edge scenario.
              Last edited by dungeon; 18 May 2014, 09:18 PM.



              • #17
                The Tesseract dev and I are thinking roughly the same thing:

                Mesa seems to have a bunch of weird inexplicable crashes in the driver with anything except newer Intel hardware. I don't think most of the paths in Mesa beyond what essentially Quake engines would use have been exercised much...

                The fact that it was trying to compile the largest shader in the game at the time of the crash is rather suspicious.



                • #18
                  In a way, who cares? Almost nobody is gaming on Windows using OpenGL. It would be interesting to compare D3D games that have OpenGL Linux counterparts. I bet Linux would come up short.



                  • #19
                    Originally posted by bridgman View Post
                    Whoops, you're right. Didn't mean to reverse the words. That said, I don't like either phrase.

                    I would translate "not largely shared" to something less than 50% (say 40-45) and "largely not shared" to something much less, maybe 20-25%. AFAIK the code sharing is still over 50%, although certainly lower than it used to be before MS moved the graphics memory manager and related code from the driver to the OS starting with Vista.
                    We each translate it differently, but that's what happens with vague terms like 'largely'... not to mention word order.

                    Still over 50% sounds pretty good, although I was hoping for a higher percentage. Are there any possibilities or plans to increase this further, especially now that OpenGL is getting more important for cross-platform development and game engines are targeting that possibility?



                    • #20
                      Originally posted by molecule-eye View Post
                      In a way, who cares? Almost nobody is gaming on Windows using OpenGL. It would be interesting to compare D3D games that have OpenGL Linux counterparts. I bet Linux would come up short.
                      I expect OpenGL to become increasingly important, especially for cross-platform and mobile development. Besides supporting two rendering backends in a game engine, there's also the testing aspect. It's a lot of work to test cross-platform software, especially since it has to be tested for each GPU manufacturer and each GPU, for both OpenGL and DirectX, each with its own problems. For simpler games I wouldn't be surprised if a developer chose to support only OpenGL when targeting cross-platform. Testing and debugging are expensive.

