AMD Radeon HD 5750/5770

  • Originally posted by bridgman View Post
    The choice wasn't really that simple. The open source drivers were already better at everything except 3D, and that covered relatively more users, so we took the open source route. That was probably the best choice for most users, but people who had recently purchased high end 5xx boards for use with Linux probably would not agree.
    Exactly. You left the fans who spent that extra cash for the "better" solution out in the cold after a very short while, basically leaving them with a useless piece of hardware that they paid extra for to get that functionality.

    Comment


    • Yes, but with a couple of obvious qualifiers. I don't think we had a lot of fans buying high end ATI hardware primarily for use with Linux back in 2005 and 2006. I have run across a few in the last year but probably not more than I could count on one hand.

      Most of the high end 5xx buyers were gaming on Windows even if they ran Linux the rest of the time, and nothing changed for them. Remember this was before we had even mentioned the new OpenGL driver, so 3D performance for consumer apps was a lot slower than it is today.

      Also I would hardly call it "leaving them with a useless piece of hardware" - maybe we can settle on "inconvenient" ?

      Comment


      • Let me try to rephrase the question, because I think the answer was a bit vague:

        If I don't want an LTS distro (for various reasons) and today I buy an HD5870 (it could be any other card; this is only an example), and I use it with Cinelerra or Blender or any hypothetical game that pushes 3D at peak performance (also making full use of OpenGL) for, hypothetically, about 3 or 4 years, I'd be very disappointed if, once deprived of Catalyst support and forced to move to the open drivers, I ended up with drastically reduced performance!

        Therefore, what should I expect for the future (2-3 years)?
        Is it likely, or not, that I could continue to use my card without losing performance and features (especially with regard to 3D)?

        I hope this will be my last question, and that I will cease to haunt you, if I get a full answer.

        Thanks in advance.

        Comment


        • Originally posted by bridgman View Post
          Yes, but with a couple of obvious qualifiers. I don't think we had a lot of fans buying high end ATI hardware primarily for use with Linux back in 2005 and 2006. I have run across a few in the last year but probably not more than I could count on one hand.

          Most of the high end 5xx buyers were gaming on Windows even if they ran Linux the rest of the time, and nothing changed for them. Remember this was before we had even mentioned the new OpenGL driver, so 3D performance for consumer apps was a lot slower than it is today.

          Also I would hardly call it "leaving them with a useless piece of hardware" - maybe we can settle on "inconvenient" ?
          You are forgetting that it doesn't matter how high-end or not a GPU is for users to have a crappy experience. It's not getting better just because someone has a low-end GPU; it sucks for them in the same way as with a high-end card.

          The way you put it (though surely not intended) sounds a bit like owners of low-end hardware will have a better experience than those with high-end hardware.

          Comment


          • Originally posted by bridgman View Post
            Also I would hardly call it "leaving them with a useless piece of hardware" - maybe we can settle on "inconvenient" ?
            Well, let's put it this way: because of that fiasco, we had to replace 2 computer labs at the community college that had RS690/SB600 machines running Linux. They were "useless pieces of hardware". Needless to say, when we put the bid out to tender, it specified no AMD chipsets.

            Comment


            • I might also add that the machines were not even 14 months old at that time.

              Comment


              • We know...

                In September 2006, ATI dropped fglrx support for R200 and older chips.
                In April 2009, AMD dropped fglrx support for R500 and older chips.

                That was 31 months in between, so by this seismology the next earthquake could be expected:

                In October 2011, AMD will drop fglrx support for R800 and older chips.

                How time flies

                Comment


                • OK, I must say... I still have a 9250 card, and from September 2006 till today the performance of the r200 Mesa driver is practically the same: at best 60% of fglrx, or even worse with KMS. And it always freezes Linux in Call of Duty (Wine), a Quake 3 engine based game. With R200_NO_TCL it works without freezing but at very few fps; with fglrx it runs just like on Windows.

                  In the 2D/Xv area I can't complain; it has always been acceptable, good, or even the best.
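                  For context, R200_NO_TCL is an environment variable the r200 Mesa driver reads at startup to fall back from the hardware transform/clip/lighting path to software TCL. A minimal sketch of scoping that workaround to a single game launch, so the rest of the desktop keeps hardware TCL (the wrapper name and the example game command are made up here):

```shell
# Sketch: enable the R200_NO_TCL workaround for one process only.
# A VAR=value prefix applies the variable to that command alone;
# it never becomes part of the parent shell's environment.
launch_no_tcl() {
    R200_NO_TCL=1 "$@"   # e.g. launch_no_tcl wine codmp.exe (hypothetical path)
}

launch_no_tcl printenv R200_NO_TCL        # the child process sees the flag
printenv R200_NO_TCL || echo "unset in parent"
```

This trades fps for stability only in the one game that needs it, instead of exporting the variable session-wide.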

                  Comment


                  • Originally posted by RealNC View Post
                    You are forgetting that it doesn't matter how high-end or not a GPU is for users to have a crappy experience. It's not getting better just because someone has a low-end GPU; it sucks for them in the same way as with a high-end card.
                    The difference is that if they had a low end card then Linux gaming was not likely to be the primary usage of the system... and most of the other common usage scenarios already worked as well or better with the open source drivers.

                    Comment


                    • Originally posted by bingel View Post
                      Let me try to rephrase the question, because I think the answer was a bit vague:

                      If I don't want an LTS distro (for various reasons) and today I buy an HD5870 (it could be any other card; this is only an example), and I use it with Cinelerra or Blender or any hypothetical game that pushes 3D at peak performance (also making full use of OpenGL) for, hypothetically, about 3 or 4 years, I'd be very disappointed if, once deprived of Catalyst support and forced to move to the open drivers, I ended up with drastically reduced performance!

                      Therefore, what should I expect for the future (2-3 years)?
                      Is it likely, or not, that I could continue to use my card without losing performance and features (especially with regard to 3D)?

                      I hope this will be my last question, and that I will cease to haunt you, if I get a full answer.

                      Thanks in advance.
                      Nobody can tell you what is going to happen 4-5 years from now at AMD, at NVidia, or at any other company in this industry. If someone tries to tell you what will happen then, I suggest you ignore them and walk away.

                      You know the support timetable we generally aim for, and you know that we tend to drop along DX generation boundaries (3xx-5xx were all DX9, for example).

                      My guess is that the next cutoff would happen between DX10 (HD3xxx/HD4xxx) and DX11 (HD5xxx), and the 3D code is structured to make that possible, but I have no way of telling when that will happen or what our support policy will be when it happens.

                      Comment
