The Highly-Anticipated XCOM 2 Game For Linux Will Be NVIDIA-Only

  • bridgman
    replied
    Originally posted by bug77 View Post
    It depends on what checking you are dropping. Somehow I suspect nvidia's drivers are a "free for all" type of affair.
    And as for what I want, I only want a video card that works. And I have that already.
    The checking I am talking about dropping is the checking you referred to when you said "become more permissive". Whatever that was.

    If you want a video card that works, then you shouldn't be talking about other driver vendors dropping checking. If any other vendor made free-for-all drivers, you would suddenly start seeing code developed on those drivers breaking on NVidia, since (all together now) "there is no standard for non-standard behavior".

    If all vendors are strict, everything works. If one vendor has lax checking, everything works for them but not for anyone else. If more than one vendor has lax checking, everything falls apart very quickly.
    Last edited by bridgman; 03 February 2016, 12:53 PM.
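
    A frequently cited concrete example of the failure mode bridgman describes: in an OpenGL core profile there is no default vertex array object, so issuing a draw call with VAO 0 bound should raise GL_INVALID_OPERATION, yet some permissive drivers have historically accepted the call and drawn anyway, which is exactly how code that only works on one vendor gets written. The sketch below is an illustration, not code from this thread; it assumes GLFW and GLEW are available for context creation and function loading.

    #include <GL/glew.h>
    #include <GLFW/glfw3.h>
    #include <stdio.h>

    int main(void)
    {
        if (!glfwInit())
            return 1;

        /* Ask for a 3.2 core-profile context, where the VAO rule applies. */
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        GLFWwindow *win = glfwCreateWindow(640, 480, "vao-check", NULL, NULL);
        if (!win) {
            glfwTerminate();
            return 1;
        }
        glfwMakeContextCurrent(win);
        glewExperimental = GL_TRUE;       /* needed for core contexts with older GLEW */
        glewInit();
        glGetError();                     /* discard any error GLEW itself raised */

        /* Out of spec: no vertex array object bound. A strict driver records
         * GL_INVALID_OPERATION and draws nothing; a permissive driver may
         * accept it, and code written against it silently becomes non-portable. */
        glDrawArrays(GL_TRIANGLES, 0, 3);
        printf("draw without VAO: 0x%04x\n", glGetError());

        /* In spec: create and bind a VAO first. Nothing useful is drawn here
         * either (no shader program is bound), but per the spec no error is
         * generated, and every conforming driver agrees on that. */
        GLuint vao;
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        printf("draw with VAO:    0x%04x\n", glGetError());

        glfwTerminate();
        return 0;
    }

    Built with something like cc vao-check.c -lGLEW -lglfw -lGL, a strict implementation reports 0x0502 (GL_INVALID_OPERATION) for the first draw, while a lax one may report 0, which is the portability trap.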


  • bug77
    replied
    Originally posted by bridgman View Post
    But of course if you drop checking then you get implementation-specific behaviour (HW-specific as well as driver-specific), not consistent behaviour.

    That's not what you want.
    It depends on what checking you are dropping. Somehow I suspect nvidia's drivers are a "free for all" type of affair.
    And as for what I want, I only want a video card that works. And I have that already.


  • bridgman
    replied
    If you drop checking then you get implementation-specific behaviour (HW-specific as well as driver-specific), not consistent behaviour. That's not what you want.

    The reason for having standards in the first place is to allow portability between implementations.

    The ideal model would be to have strict checking during development but relaxed checking during deployment, which is where Vulkan and some of the GL implementations are going. At the moment it's pretty much the other way round -- lax checking during development, then "gee, too bad it doesn't work on all the other implementations" during deployment.
    Last edited by bridgman; 03 February 2016, 01:34 PM.
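
    The "strict during development, relaxed during deployment" split is exactly how Vulkan packages its checking: validation lives in optional layers that a developer enables while debugging and leaves out of shipping builds. A minimal sketch, assuming the combined layer name used by current LunarG SDKs (VK_LAYER_KHRONOS_validation; older SDKs shipped it as VK_LAYER_LUNARG_standard_validation) and that the layer is installed:

    #include <vulkan/vulkan.h>
    #include <stdio.h>

    int main(void)
    {
    #ifndef NDEBUG
        /* Development build: enable the full validation layer so every
         * out-of-spec call is reported (vkCreateInstance returns
         * VK_ERROR_LAYER_NOT_PRESENT if the layer is not installed). */
        const char *layers[] = { "VK_LAYER_KHRONOS_validation" };
        uint32_t layer_count = 1;
    #else
        /* Release build: no layers, so the driver does no extra checking. */
        const char **layers = NULL;
        uint32_t layer_count = 0;
    #endif

        VkApplicationInfo app = {
            .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .pApplicationName = "strict-in-dev-lax-in-deploy",
            .apiVersion = VK_API_VERSION_1_0,
        };
        VkInstanceCreateInfo info = {
            .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app,
            .enabledLayerCount = layer_count,
            .ppEnabledLayerNames = layers,
        };

        VkInstance instance;
        VkResult res = vkCreateInstance(&info, NULL, &instance);
        if (res != VK_SUCCESS) {
            fprintf(stderr, "vkCreateInstance failed: %d\n", res);
            return 1;
        }
        vkDestroyInstance(instance, NULL);
        return 0;
    }

    A shipped game would build with NDEBUG defined, so the layer count drops to zero and the driver performs no extra per-call checking; only developers pay the validation cost.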


  • bug77
    replied
    Originally posted by Serafean View Post

    And intel, broadcom etc. should too? IMO if there is a single ugly duckling, it should get the boot. If most players implement the spec in a specific different way, the spec should be changed. I honestly don't see making nvidia's opengl the standard as a solution... And AFAIK it's "easier" to develop for because it is less restrictive. Whether a less restrictive GL spec is good or bad is up to GL experts, but when programming I sure prefer a stricter compiler to a lenient one (I get fewer surprises)...

    Profiles: imagine if there were CPU profiles... I mean, what the hell? Why is this even necessary? Bottlenecks should be found and eliminated; working around them through profiles is kicking the problem down the road.
    If being strict incurs such overhead that developers would rather drop AMD and intel support altogether, then yes, become more permissive. Because it probably means there's something wrong with the standard.


  • Serafean
    replied
    Originally posted by bug77 View Post
    And yes, if their out-of-spec APIs really make it that much easier to work with their hardware, AMD should copy those too.
    And intel, broadcom etc. should too? IMO if there is a single ugly duckling, it should get the boot. If most players implement the spec in a specific different way, the spec should be changed. I honestly don't see making nvidia's opengl the standard as a solution... And AFAIK it's "easier" to develop for because it is less restrictive. Whether a less restrictive GL spec is good or bad is up to GL experts, but when programming I sure prefer a stricter compiler to a lenient one (I get fewer surprises)...

    Profiles: imagine if there were CPU profiles... I mean, what the hell? Why is this even necessary? Bottlenecks should be found and eliminated; working around them through profiles is kicking the problem down the road.


  • bug77
    replied
    Originally posted by Herem View Post

    I think AMD are more concerned with bringing the open drivers up to feature parity with Catalyst before working on individual game profiles. I'm pretty sure most people would rather new OpenGL features weren't delayed by diverting resources to get a few more fps in the latest games.
    Since you can't play properly until you have both "OpenGL features" and game profiles, the order in which they are added is pretty much irrelevant.
    The real issue here is that there's no plan to add profiles at all. There is support for profiles, but actually writing and validating profiles is not a scheduled item.


  • directhex
    replied
    Originally posted by chris200x9 View Post
    Welp, I'm off to cancel my pre-order. Kind of bummed too, because I got the digital deluxe for $56 on GMG, but I cannot support this type of behavior.

    Why should Feral care? They don't get paid a cent for GMG purchases.


  • Herem
    replied
    Originally posted by bug77 View Post
    Yes, that's a big part, too. Nvidia has profiles for just about every title. The open AMD driver doesn't, and it won't anytime soon. Since no one expects Catalyst on Windows to ship without proper profiles, I don't know why AMDGPU would be allowed to do it just because it's open.
    I think AMD are more concerned with bringing the open drivers up to feature parity with Catalyst before working on individual game profiles. I'm pretty sure most people would rather new OpenGL features weren't delayed by diverting resources to get a few more fps in the latest games.


  • bug77
    replied
    Originally posted by bridgman View Post

    This is where you might not be hearing what others are saying. You are basically saying "don't buy AMD until they accurately emulate the undocumented out-of-spec API usage that NVidia drivers allow, so that games developed only on NVidia will always work on AMD with zero effort even if the app does not follow the OpenGL spec".

    I'm not sure that's what you really want, unless you work for NVidia.
    You know very well this is not all about "undocumented out-of-spec API"; it's also about sub-par OpenGL performance.
    What I'm saying when I buy Nvidia is that I don't care about internal implementation details or whether open software is better than closed (I know that it is). I care about working products first and foremost. I've been told time and again that Nvidia is evil because it's closed and non-standard, yet time and again titles released for Linux work exclusively with Nvidia's blobs. And yes, if their out-of-spec APIs really make it that much easier to work with their hardware, AMD should copy those too.


  • eydee
    replied
    Why do they let Feral butcher these games? VP would do it cheaper, and it would perform better.
