
Thread: AMD Radeon R9 290 On Ubuntu 14.04 With Catalyst Can Beat Windows 8.1

  1. #11
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,514

    Default

    Quote Originally Posted by tmpdir View Post
    I said "not largely shared"; "largely not shared" feels like a completely different statement. The first could mean 65% is shared, while the second probably means 35% is shared.

    But I stand corrected; it makes me look at OpenGL test results and changes differently.
    Whoops, you're right. Didn't mean to reverse the words. That said, I don't like either phrase.

    I would translate "not largely shared" to something less than 50% (say 40-45) and "largely not shared" to something much less, maybe 20-25%. AFAIK the code sharing is still over 50%, although certainly lower than it used to be before MS moved the graphics memory manager and related code from the driver to the OS starting with Vista.

  2. #12
    Join Date
    Aug 2011
    Posts
    542

    Default

    I have always wondered why the free drivers seem so much closer in performance in FOSS games compared to commercial ones. It's starting to dawn on me that the proprietary drivers probably have tons of hacks and replacement shaders for any commercial gaming/benchmark engine to artificially boost their FPS results. That is probably also part of their "secret sauce" that they absolutely cannot give up (i.e. through open sourcing).

  3. #13
    Join Date
    Apr 2014
    Posts
    154

    Default

    Quote Originally Posted by Ancurio View Post
    I have always wondered why the free drivers seem so much closer in performance in FOSS games compared to commercial ones. It's starting to dawn on me that the proprietary drivers probably have tons of hacks and replacement shaders for any commercial gaming/benchmark engine to artificially boost their FPS results. That is probably also part of their "secret sauce" that they absolutely cannot give up (i.e. through open sourcing).
    Sir you forgot your tinfoil hat.

  4. #14
    Join Date
    Aug 2011
    Posts
    542

    Default

    Quote Originally Posted by grndzro View Post
    Sir you forgot your tinfoil hat.
    Quote Originally Posted by http://richg42.blogspot.de/2014/05/the-truth-on-opengl-driver-quality.html
    Historically, this vendor (Nvidia) will do things like internally replace entire shaders for key titles to make them perform better (sometimes much better). Most drivers probably do stuff like this occasionally, but this vendor will stop at nothing for performance. What does this mean to the PC game industry or graphics devs? It means you, as "Joe Graphics Developer", have little chance of achieving the same technical feats in your title (even if you use the exact same algorithms!) because you don't have an embedded vendor driver engineer working specifically on your title making sure the driver does exactly the right thing (using low-level optimized shaders) when your specific game or engine is running. It also means that, historically, some of the PC graphics legends you know about aren't quite as smart or capable as history paints them to be, because they had a lot of help.
    This should be common knowledge, actually.
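
    The mechanism the quoted post describes can be sketched roughly like this. Everything here is invented for illustration (application names, the lookup scheme, the hash helper); real drivers do this in native code inside their shader compilers, keyed on the running executable:

    ```python
    import hashlib

    def shader_hash(source: str) -> str:
        """Stand-in for a real content hash of the shader text."""
        return hashlib.sha256(source.encode()).hexdigest()[:8]

    # The game's own shader, and a hypothetical vendor-tuned replacement.
    ORIGINAL = "void main() { /* the game's own shader */ }"
    OPTIMIZED = "void main() { /* vendor's hand-tuned version */ }"

    # Table mapping (application name, shader hash) -> replacement source,
    # shipped inside the driver and updated per title.
    REPLACEMENT_SHADERS = {
        ("quake3.exe", shader_hash(ORIGINAL)): OPTIMIZED,
    }

    def compile_shader(app_name: str, source: str) -> str:
        """Return the shader source the driver would actually compile.

        If the driver recognizes this exact shader in this exact title,
        it silently substitutes its own optimized version; otherwise it
        compiles what the application submitted.
        """
        return REPLACEMENT_SHADERS.get((app_name, shader_hash(source)), source)
    ```

    The point of the sketch is the asymmetry discussed above: the same shader submitted by an unrecognized title gets no substitution, so "Joe Graphics Developer" cannot reach the same performance even with identical code.
    
    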

  5. #15
    Join Date
    Feb 2008
    Posts
    1,083

    Default

    Quote Originally Posted by Ancurio View Post
    I have always wondered why the free drivers seem so much closer in performance in FOSS games compared to commercial ones. It's starting to dawn on me that the proprietary drivers probably have tons of hacks and replacement shaders for any commercial gaming/benchmark engine to artificially boost their FPS results. That is probably also part of their "secret sauce" that they absolutely cannot give up (i.e. through open sourcing).
    Yep, you are 100% right; those blob guys always play outside the specs - but that is business.

    Mantle? You can have it. Where? You know. But why? That is business.
    Last edited by dungeon; 05-18-2014 at 09:01 PM.

  6. #16
    Join Date
    Feb 2008
    Posts
    1,083

    Default

    Also, read the history of Mesa3D; maybe that is why Mesa traditionally performs well on Quake engine games.

    May 13, 1999

    May 1999 - John Carmack of id Software, Inc. has made a donation of
    US$10,000 to the Mesa project to support its continuing development.
    Mesa is a free implementation of the OpenGL 3D graphics library and id's
    newest game, Quake 3 Arena, will use Mesa as the 3D renderer on Linux.

    The donation will go to Keith Whitwell, who has been optimizing Mesa to
    improve performance on 3d hardware. Thanks to Keith's work, many
    applications using Mesa 3.1 will see a dramatic performance increase
    over Mesa 3.0. The donation will allow Keith to continue working on
    Mesa full time for some time to come.

    For more information about Mesa see www.mesa3d.org. For more
    information about id Software, Inc. see www.idsoftware.com.

    --------------------------------

    This donation from John/id is very generous. Keith and I are very
    grateful.
    And many people know that Mesa is well tested and performs well, but only on those engines; others are something of a rough-edge scenario.
    Last edited by dungeon; 05-18-2014 at 09:18 PM.

  7. #17
    Join Date
    Feb 2008
    Posts
    1,083

    Default

    The Tesseract dev and I are thinking roughly the same thing:

    Mesa seems to have a bunch of weird inexplicable crashes in the driver with anything except newer Intel hardware. I don't think most of the paths in Mesa beyond what essentially Quake engines would use have been exercised much...

    The fact that it was trying to compile the largest shader in the game at the time of the crash is rather suspicious.
    http://tesseract.gg/forum/viewtopic.php?id=46

  8. #18
    Join Date
    Sep 2009
    Posts
    203

    Default

    In a way, who cares? Almost nobody is gaming on Windows using OpenGL. It would be interesting to compare D3D games that have OpenGL Linux counterparts. I bet Linux would come up short.

  9. #19
    Join Date
    Aug 2008
    Location
    Netherlands
    Posts
    285

    Default

    Quote Originally Posted by bridgman View Post
    Whoops, you're right. Didn't mean to reverse the words. That said, I don't like either phrase.

    I would translate "not largely shared" to something less than 50% (say 40-45) and "largely not shared" to something much less, maybe 20-25%. AFAIK the code sharing is still over 50%, although certainly lower than it used to be before MS moved the graphics memory manager and related code from the driver to the OS starting with Vista.
    We both translate it differently, but that's what happens with vague terms like "largely"... not to mention word order.

    Still over 50% sounds pretty good, although I was hoping for a higher percentage. Are there any possibilities or plans to increase this further, especially now that OpenGL is getting more important for cross-platform development and game engines are targeting it?

  10. #20
    Join Date
    Aug 2008
    Location
    Netherlands
    Posts
    285

    Default

    Quote Originally Posted by molecule-eye View Post
    In a way, who cares? Almost nobody is gaming on Windows using OpenGL. It would be interesting to compare D3D games that have OpenGL Linux counterparts. I bet Linux would come up short.
    I expect OpenGL to become increasingly important, especially for cross-platform and mobile development. Besides supporting two rendering back-ends in a game engine, there's also the testing aspect. It's a lot of work to test cross-platform software, especially considering they have to test it for each GPU manufacturer and each GPU, under both OpenGL and DirectX, each with its own problems. For simpler games I wouldn't be surprised if a developer chose to support only OpenGL when targeting cross-platform. Testing and debugging are expensive.
