NVIDIA 375.10 vs. Linux 4.8 + Mesa 13.1-dev AMD GPU Benchmarks


  • duby229
    replied
    Originally posted by johnc

    There is no "penalty" in "visual fidelity" and "quality of rendered output".

    Try making up some other excuse.
    Dude, I can post a dozen more links proving it again, and the truth still wouldn't matter one tiny bit to you. So keep up the bliss.



  • oooverclocker
    replied
    Originally posted by johnc
    There is no "penalty" in "visual fidelity" and "quality of rendered output".
    At least there was some penalty in the past, for example with Ashes of the Singularity: there have been cases where the Nvidia Windows driver reduced the instructions that were passed to the card. Of course, this is pretty unlikely for the Nouveau driver.

    Another subtle point is that mid-range cards' VRAM is usually crippled somehow. When the 970's or 1060's VRAM overflows, most modern game engines dynamically reduce the detail of some textures. So plenty of VRAM on cheaper cards doesn't just extend their lifespan, it also prevents visual limitations. To be honest, from a visual point of view the games we have working on Linux have pretty poor textures, so they won't exceed your VRAM. This may change with the new Deus Ex and hopefully other games soon.
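    The dynamic detail reduction described above can be sketched as a toy budget loop. This is a hypothetical illustration, not any real engine's streaming code; the function name, texture names, and sizes are all made up, and real engines use much finer-grained heuristics:

```python
def fit_into_vram(textures, budget_mb):
    """Drop mip levels until the resident set fits the VRAM budget,
    degrading the largest textures first. Dropping the top mip quarters
    a texture's memory footprint, since both dimensions halve."""
    sizes = dict(textures)  # texture name -> resident size in MB
    while sum(sizes.values()) > budget_mb:
        biggest = max(sizes, key=sizes.get)
        sizes[biggest] /= 4  # discard the top (highest-resolution) mip level
    return sizes

# A 970/1060-style squeeze: ~5 GB of requested textures, 4 GB of VRAM.
resident = fit_into_vram(
    {"terrain": 2048, "characters": 1536, "foliage": 1024, "ui": 512},
    budget_mb=4096)
assert sum(resident.values()) <= 4096  # only "terrain" had to be degraded
```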



  • Passso
    replied
    Originally posted by dungeon
    And of course there are differences between JPEG and raw images, between MP3 and raw audio... but the thing is most people do not notice that difference
    If most people cannot see the difference, it cannot be considered "cheating" but rather "optimization", and those who do not do it are "brute forcers".

    This is how the IT world improves!




  • Passso
    replied
    Originally posted by johnc

    Ohh stop.

    You people won't be satisfied until the colors on those graphs are completely reversed, then you'll be praising how representative of reality these benchmarks are.

    For April 1 Michael should run the same benchmarks but switch the video card labels and colors. The forum thread would be hilarious.

    The truth has always been that if you want open drivers on a discrete GPU you need to buy AMD. But you will have to accept a penalty in terms of performance and features. If you're willing to do that, fine. But don't lie to yourself and everyone else here and pretend that sacrifice isn't being made.
    Happy to see that there are some sane people on this forum. I felt so alone I was just about to call a doctor.



  • dungeon
    replied
    What can I say, bias in anything is reality, same as cheating... which might lead to good innovations



  • fuzz
    replied
    Originally posted by dungeon
    So to be fair, then read this too

    Really we can say that they are all cheating, but really they just try to enhance things here and there
    Eh, if I wanted my driver to do things against my will I would buy a console.

    Funny how they even admit it. The entire point here is that review websites don't take it into account, thus leading to bias.



  • dungeon
    replied
    And of course there are differences between JPEG and raw images, between MP3 and raw audio... but the thing is most people do not notice that difference

    Where nVidia "cheats" the most is on Ti cards. Yes, all of that is cheating... but it is done in a way that most people would not notice anything
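    The JPEG/MP3 analogy boils down to quantization: both formats deliberately throw away precision where people are unlikely to notice. A minimal stdlib-only sketch of that principle (the step size and sample values here are arbitrary, purely for illustration):

```python
def quantize(samples, step):
    """Map each sample to the nearest multiple of `step` (the lossy part:
    many nearby inputs collapse to the same integer code)."""
    return [round(s / step) for s in samples]

def dequantize(codes, step):
    """Reconstruct approximate samples from the integer codes."""
    return [c * step for c in codes]

samples = [0.12, 0.48, 0.51, 0.97, 0.33]
codes = quantize(samples, 0.1)      # small integers compress far better
restored = dequantize(codes, 0.1)
max_error = max(abs(a - b) for a, b in zip(samples, restored))
# The reconstruction error is bounded by half the step size --
# the trick is choosing a step below most people's perception threshold.
assert max_error <= 0.05 + 1e-9
```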



  • dungeon
    replied
    Originally posted by fuzz
    Did you even look at the evidence presented? I thought it was bullshit until I looked.
    So to be fair, then read this too



    Really we can say that they are all cheating, but really they just try to enhance things here and there

    Even every compression is cheating; S3 "cheated" nearly two decades ago with their texture compression, but since then most games have used it
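    That S3 scheme survives today as S3TC/DXT1, which packs each 4x4 texel block into 8 bytes: two RGB565 endpoint colors plus sixteen 2-bit palette indices, a fixed 6:1 ratio versus 24-bit RGB. A rough stdlib-only decoder sketch of a single block, based on the published format (function names are my own):

```python
import struct

def rgb565_to_rgb888(c):
    # Expand a 16-bit 5:6:5 color to 8 bits per channel.
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1 (S3TC) block into a 4x4 grid of RGB tuples."""
    c0_raw, c1_raw, indices = struct.unpack("<HHI", block)
    c0 = rgb565_to_rgb888(c0_raw)
    c1 = rgb565_to_rgb888(c1_raw)
    if c0_raw > c1_raw:
        # Opaque mode: two extra colors interpolated at 1/3 and 2/3.
        palette = [c0, c1,
                   tuple((2 * a + b) // 3 for a, b in zip(c0, c1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))]
    else:
        # 1-bit-alpha mode: midpoint color plus transparent black.
        palette = [c0, c1,
                   tuple((a + b) // 2 for a, b in zip(c0, c1)),
                   (0, 0, 0)]
    # Each texel picks one of the 4 palette entries with 2 bits.
    return [[palette[(indices >> (2 * (y * 4 + x))) & 0b11]
             for x in range(4)] for y in range(4)]

# White endpoint, all indices 0 -> a solid white 4x4 block.
block = decode_dxt1_block(b"\xff\xff\x00\x00\x00\x00\x00\x00")
assert block == [[(255, 255, 255)] * 4 for _ in range(4)]
```

    The "cheat" is visible in the math: every 4x4 region is forced onto a 4-color palette, yet on typical photographic textures the error is small enough that, as said above, most people never notice.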



  • fuzz
    replied
    Originally posted by johnc
    There is no "penalty" in "visual fidelity" and "quality of rendered output".

    Try making up some other excuse.
    Did you even look at the evidence presented? I thought it was bullshit until I looked.



  • johnc
    replied
    Originally posted by duby229
    And the truth has always been just as much that if you want to buy nVidia hardware you must accept a penalty in terms of visual fidelity and quality of the rendered output. What you ignore is the issue of the "right" performance vs the "most" performance. You continue making that mistake and you will continue being wrong.
    There is no "penalty" in "visual fidelity" and "quality of rendered output".

    Try making up some other excuse.

