
Catalyst 10.1 Still Trash In Heaven, But Good News


  • #31
    Originally posted by Kano View Post
    @bridgman

    The only HW vendor with a 10.1 driver is ATI, so why was it NV now?
    N

    D

    A



    • #32
      The Unigine rep didn't actually mention Cat 10.1, if you read the fine print. The rest of the paragraph reads like supposition. It may actually be the case (I don't know), but the statement from Unigine wasn't "the driver is not ready" or "ATI/AMD asked us..." but rather "a hardware vendor asked us...".

      It just seemed a bit odd to me...



      • #33
        "We don't want to release our program for Linux until Linux can run it well."

        Is the program even ready and that is simply an excuse? Are you afraid it will look bad? Or are you joining with a hardware vendor to release everything all at once to try to drum up more attention?

        I vote 3, if not all of them, but who knows. It's a CORPORATE MYSTERY.



        • #34
          Originally posted by Yfrwlf View Post
          "We don't want to release our program for Linux until Linux can run it well."

          Is the program even ready and that is simply an excuse? Are you afraid it will look bad? Or are you joining with a hardware vendor to release everything all at once to try to drum up more attention?

          I vote 3, if not all of them, but who knows. It's a CORPORATE MYSTERY.
          I, for one, just think that they are cooperating with HW vendors to get it to work with all the modern and capable cards out there, and that one HW vendor told them it would be better to wait a little longer because they haven't finished fixing bug X or feature Y yet.

          I think it is a little too early to just start speculating about it. Maybe it's the result of the UT3 disaster. (I can call it a disaster, right?)



          • #35
            I think the most likely explanation is that AMD still wants the ability to modify their extension a bit if necessary. Once there's a demo out there that is actually trying to use it, it becomes much harder to justify changing the API and breaking compatibility, even if the demo was only programmed against an unsupported pre-release driver.
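
            To make that concrete: the usual defence on the application side is to probe for the extension at runtime rather than assume it exists. Here's a minimal sketch in C of that pattern; the extension name GL_ARB_tessellation_shader is just my stand-in, since nothing in this thread says which extension Unigine is actually coding against:

            #include <stdio.h>
            #include <string.h>
            #include <GL/gl.h>

            /* Classic GL 2.x-style query: returns 1 if the driver advertises
             * the named extension. Assumes a current GL context exists. */
            static int has_extension(const char *name)
            {
                const char *exts = (const char *)glGetString(GL_EXTENSIONS);
                return exts != NULL && strstr(exts, name) != NULL;
            }

            int main(void)
            {
                /* Context creation (GLX/WGL/SDL/...) omitted for brevity. */
                if (has_extension("GL_ARB_tessellation_shader"))
                    printf("tessellation path enabled\n");
                else
                    printf("extension not advertised, using the fallback path\n");
                return 0;
            }

            (A strict check would match whole space-separated tokens rather than strstr() substrings, but the point stands: you can only probe for the final extension string once the vendor has committed to it, which is exactly why coding against a pre-release driver is risky.)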



            • #36
              Originally posted by V!NCENT View Post
              I, for one, just think that they are cooperating with HW vendors to get it to work with all the modern and capable cards out there, and that one HW vendor told them it would be better to wait a little longer because they haven't finished fixing bug X or feature Y yet.
              Knowing how this kind of stuff works, odds are fairly good that you're right on the money on this line of thought...

              I think it is a little too early to just start speculating about it. Maybe it's the result of the UT3 disaster. (I can call it a disaster, right?)
              Call it what you like. I certainly don't think it was a success...



              • #37
                Originally posted by rohcQaH View Post
                As far as you're concerned, they do. fglrx is mostly maintained for the workstation customers (and has to be maintained for them), but AMD isn't throwing lots of man-power at it to quickly implement consumer features like video acceleration or shiny-but-useless Heaven demos.
                Uh... the first might be useless for workstation work, but don't kid yourself into thinking the second isn't very, very useful.



                • #38
                  Originally posted by agd5f View Post
                  It depends largely on what GPU the game was mainly developed on. The 3D APIs are pretty loose in a lot of corner cases. That's why it's important to be first with new DX/GL HW, since more game vendors will tend to use your hardware and expect your driver's behavior in those gray areas.
                  If what you say is true, then I suspect the GPU landscape is about to change significantly. ATI's DirectX 11 cards have been on the market for almost 5 months now, while nvidia's performance line doesn't even support DirectX 10.1. Does this mean the next generation of games will be designed to work with ATI's gray areas? That could be huge, especially considering ATI already holds the power efficiency crown with the HD 5850.

                  Interesting.
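
                  For what it's worth, those gray areas are why so many engines end up sniffing the driver at startup and flipping vendor-specific workarounds. A rough sketch in C of that common pattern; the flag and the vendor strings here are purely illustrative, not taken from any particular game:

                  #include <stdbool.h>
                  #include <stdio.h>
                  #include <string.h>
                  #include <GL/gl.h>

                  /* Illustrative flag: enables code paths that dodge behavior one
                   * vendor's driver handles differently in under-specified corners. */
                  static bool use_vendor_workarounds = false;

                  static void detect_driver(void)
                  {
                      /* Assumes a current GL context. */
                      const char *vendor = (const char *)glGetString(GL_VENDOR);

                      if (vendor && (strstr(vendor, "ATI") || strstr(vendor, "AMD")))
                          use_vendor_workarounds = true;
                  }

                  int main(void)
                  {
                      detect_driver(); /* context creation omitted for brevity */
                      printf("vendor workarounds: %s\n", use_vendor_workarounds ? "on" : "off");
                      return 0;
                  }

                  Whichever vendor's hardware a game is mainly developed on ends up defining "correct" behavior in those corners, and everyone else patches around it afterwards, which is exactly agd5f's point about being first to market with new DX/GL hardware.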



                  • #39
                    Originally posted by Joe Sixpack View Post
                    If what you say is true, then I suspect the GPU landscape is about to change significantly. ATI's DirectX 11 cards have been on the market for almost 5 months now, while nvidia's performance line doesn't even support DirectX 10.1. Does this mean the next generation of games will be designed to work with ATI's gray areas? That could be huge, especially considering ATI already holds the power efficiency crown with the HD 5850.

                    Interesting.
                    Very, very interesting... The fact that ATI could be ahead by a large margin would turn the tables for them...



                    • #40
                      Originally posted by Joe Sixpack View Post
                      If what you say is true, then I suspect the GPU landscape is about to change significantly. ATI's DirectX 11 cards have been on the market for almost 5 months now, while nvidia's performance line doesn't even support DirectX 10.1. Does this mean the next generation of games will be designed to work with ATI's gray areas? That could be huge, especially considering ATI already holds the power efficiency crown with the HD 5850.

                      Interesting.
                      It's not as if this hasn't happened before. ATI was the first one with DX9a hardware out, and many games were built with that hardware in mind. Of course, it didn't help that Nvidia's late FX series was architecturally unsuitable for DX9 shaders, to the point that Half-Life 2 fell back to DX8.1 on such cards.

