Has AMD Finally Fixed Tearing With Its Linux Driver?


  • #21
    Originally posted by Qaridarium
    there is never a soup without flies.
    Alas, I suspect you're right for the majority of cases. The trick, though, is for the chef to hide the flies amongst the vegetables so you don't notice them :-)

    • #22
      One thing in AMD's favour though is they seem to have at least gotten the hairy moths out of the soup at last.

      • #23
        broken D2D and no XAA

        The Direct2D accel is still broken for me, and now the
        Code:
        aticonfig --set-pcs-str=DDX,ForceXAA,TRUE
        won't turn it off. It seems that XAA was removed from this driver.
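        Whether XAA is even still present should show up in the X server log. A quick way to check (just a sketch; the exact log wording varies between driver releases):
        Code:
        grep -iE "XAA|EXA|AccelMethod" /var/log/Xorg.0.log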

        • #24
          Originally posted by monraaf View Post
          Bollocks. The fact is that Nvidia has managed to get tear-free playback with their Linux driver for god knows how long, and even the open-source ati driver has had it for quite some time now. It was just fglrx that didn't have it. So please spare me the nonsense about non-stock configurations and whatever other excuses.

          A man walks into a restaurant and orders a bowl of soup. When the waiter delivers the soup to his table, the man notices that there are at least a few dozen flies swimming in his soup. When the man complains to the waiter, the waiter fishes out one fly with a spoon and walks away. Now the man is not satisfied, of course, and wants to call the waiter again. Just as the man tries to raise his voice, allquixotic, sitting at the table next to him, interrupts:

          "Setting your standards high is a good thing, but there's no need to get such a sour attitude. Be happy, the waiter just fished one fly out of your soup, now Bon Appétit!"

          I never quite understood what drives people to become water carriers for ATI or [insert any other company].
          Humorous rhetoric, but that's all it is -- rhetoric. Anyone who expects the 3d graphics driver industry to be anywhere in the ballpark of the level of maturity and service of a restaurant (or a car, for that matter) is out of their mind. 3d graphics drivers didn't exist until the mid to late 90s. Restaurants have existed since a few hundred years after the dawn of civilization, and cars have existed for over 100 years.

          And yes, in fact, early cars did not necessarily come with wipers. Early restaurants likely did not come with napkins (or utensils, for that matter). Early airplanes did not come with cabin pressure.

          People are too complacent, too used to our everything's-perfect, homogenized 21st-century lifestyle. They don't realize that things are well-suited to our desires and tastes only because other people developed methods, procedures, and technologies, and then set still other people to work applying them. There is a road to travel in order to get to the ideal level of service.

          If you would prefer that all 3d graphics drivers remain in-house, unreleased research projects until they are 99.99999% perfect, then please delete all instances of Catalyst, the NVIDIA binary driver, and all open source graphics drivers from your PC right now, and do not install them until, oh, 2025 or so at the earliest. For the open source drivers, I'd say 2050. And even then, there will be the occasional engine built with a flaw that leads to a head gasket breach after 50,000 miles, or the occasional hair or fly in your soup.

          Maybe AMD should call the Catalyst Linux driver a "public beta". Would that make you happy? Because that's more or less what it is, or at least, what it had been in 2010. Maybe 2011 will be the year of Catalyst actually crystallizing into a nice driver, but I think it'd be fair to call it beta in 2010, with the lack of tear-free support and a host of OpenGL bugs. A beta driver is like a "soup" scraped up from random kitchen scraps that were slated for the trash, called "chef's experiment" or something on the menu. By setting your expectations right, you don't get a sour attitude when something's in there that you don't find appealing.

          In fact, this whole issue with you seems to be about expectations. While I, too, believe that someday Catalyst (and the open source drivers too, I hope!) will enjoy asymptotic perfection, that day is not today. Should we hold AMD to blame for putting out a driver before it's ready? It's hard to say, "Well, you should have done better than you have, in less time, and delivered everything I want yesterday!" in an industry that is barely a decade old, to a team of at least 50 people (in the Catalyst case).

          Personally, my expectations in the 3d graphics drivers space have been relativistic, rather than based upon an idea of perfection. Take the best driver out there, account for its features and stability, and rank the remaining drivers in terms of the defects they have that are absent in the best driver, or features that the best driver has that the others don't.

          Now, I share the experience of others that the NVIDIA binary driver does not provide a fully tear-free desktop experience, especially on the 2d side, but also with 3d games. But, freedom aside, many would agree that the NVIDIA binary is currently the best graphics driver for Linux in terms of features, stability, and performance. So how does Catalyst stack up?

          1. They both support the same level of the OpenGL API. Damn vague Khronos specifications and interpretation mismatches aside, it seems that the few users of the latest OpenGL specs (ahem, Unigine) can run fairly well on Catalyst. And if you go back to, say, OpenGL 2.1, Catalyst does about as well as it possibly can; it's hard to expect more. (A quick way to check what your driver exposes is sketched below.)

          2. Catalyst supports cross-API, full-screen tear-free, while NVIDIA doesn't (they only support tear-free within an individual API, e.g. OpenGL or composited desktop or a video, but not sync between all three at once). Looks like Catalyst actually wins this battle, now doesn't it?

          3. Multi-monitor is well-supported in Catalyst these days, and I know NVIDIA has their own proprietary solution that is supposed to work well.

          4. Perhaps there are video decoding issues with Catalyst under Xv and XvBA, whereas NVIDIA's VDPAU is supposed to be awesome. This is a fair point; Catalyst video playback could use some work. To me, this isn't an indispensable feature, because using OpenGL for video keeps the hardware acceleration aspect, while basically guaranteeing that a conforming OpenGL implementation will render a pixel-correct result. Since the performance (i.e., the smoothness of the framerate) is acceptable for me even at 1080p, I am satisfied.

          So on point 2, Catalyst "wins", and on point 4, there exists some ancient video API that Catalyst incorrectly implements while it correctly plays back video through another API instead. Boo hoo. From my perspective, Catalyst is actually better than the NVIDIA binary driver as of the 11.1 release. If you care about HW video decoding even when OpenGL playback performs acceptably, that's like caring that your car (damn bad analogies) has a SOHC engine instead of a DOHC. For all practical purposes, it's basically an academic point.
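          For anyone who wants to poke at points 1, 2 and 4 on their own box, here's a rough sketch from a shell. The environment variables are driver-specific knobs (NVIDIA's __GL_SYNC_TO_VBLANK, Mesa's vblank_mode); treat them as things to verify against your driver's documentation rather than gospel:
          Code:
          # Point 1: which OpenGL version and renderer the installed driver exposes
          glxinfo | grep -E "OpenGL (vendor|renderer|version)"

          # Point 2: per-API sync-to-vblank for the GL path.
          # NVIDIA binary driver:
          __GL_SYNC_TO_VBLANK=1 glxgears
          # Mesa-based open-source drivers (3 = always sync to vblank):
          vblank_mode=3 glxgears

          # Point 4: route video through the OpenGL path instead of Xv/XvBA
          mplayer -vo gl movie.mkv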

          Since you said "bollocks" to my attempt to dissuade you from car analogies, here's one more while we're at it: Catalyst is like a car that gets you from point A to point B. Earlier models lacked windshield wipers, but the 2011 model has those as well as XM radio, GPS, and improved airbags. It's gotten you from point A to point B since about 2007, but now it has all the features most ordinary users expect. But the difference is that, to make a proper car analogy, the year it started to get you from point A to point B was actually 1907, not 2007. (OK, the Model T wasn't available until late 1908, but taking it back exactly 100 years was fun for illustration purposes). And it's just "1911" now, and it already has GPS, wipers and XM radio? Holy crap, those Ford engineers move fast! :P

          • #25
            Originally posted by allquixotic View Post
            It's gotten you from point A to point B since about 2007, but now it has all the features most ordinary users expect. But the difference is that, to make a proper car analogy, the year it started to get you from point A to point B was actually 1907, not 2007. (OK, the Model T wasn't available until late 1908, but taking it back exactly 100 years was fun for illustration purposes). And it's just "1911" now, and it already has GPS, wipers and XM radio? Holy crap, those Ford engineers move fast! :P
             You know the windshield wiper was invented in 1903, right?

            • #26
              In response to the question "Has AMD Finally Fixed Tearing With Its Linux Driver?" asked by the thread title, I can give an answer for my system.

              [screenshots]

              This is with 3 x 24" LCDs, each at its native resolution of 1920x1200, with no rotation or anything fancy.
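              For reference, a three-head layout like that is what you'd describe to xrandr on a driver with RandR support (fglrx historically preferred its own aticonfig multi-head setup, so treat this as a sketch; the DFP1-DFP3 output names are placeholders, check xrandr -q for the real ones on your card):
              Code:
              xrandr --output DFP1 --mode 1920x1200 \
                     --output DFP2 --mode 1920x1200 --right-of DFP1 \
                     --output DFP3 --mode 1920x1200 --right-of DFP2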

              Looks like I'll have to dump this adaptor after all. Seems 1 GB isn't enough.

              • #27
                It gets better.

                When I was running the FirePro pre-release driver, I could at least get GL video output frame-locked, if not the whole desktop.

                Now, each monitor has its own idea of where the V-blank is, so the tearing occurs at a different place on each screen. At least the tear position doesn't roll up or down like it used to.

                • #28
                  Originally posted by monraaf View Post
                  A man walks into a restaurant and orders a bowl of soup. When the waiter delivers the soup to his table, the man notices that there are at least a few dozen flies swimming in his soup. When the man complains to the waiter, the waiter fishes out one fly with a spoon and walks away. Now the man is not satisfied, of course, and wants to call the waiter again. Just as the man tries to raise his voice, allquixotic, sitting at the table next to him, interrupts:

                  "Setting your standards high is a good thing, but there's no need to get such a sour attitude. Be happy, the waiter just fished one fly out of your soup, now Bon Appétit!"

                  I never quite understood what drives people to become water carriers for ATI or [insert any other company].
                  Haha, funny, but nonsense.
                  If you believe what you said, everything is "flies in your soup", since nothing is perfect. There are flies everywhere, so you should eat no soup at all...

                  • #29
                    Originally posted by mugginz View Post
                    One thing in AMD's favour though is they seem to have at least gotten the hairy moths out of the soup at last.
                    (Note to self: Don't read one of these threads while taking a drink... )

                    ROFLMAO!

                    • #30
                      Originally posted by mugginz View Post
                      This is with 3 x 24" LCDs, each at its native resolution of 1920x1200, with no rotation or anything fancy.
                      You're kidding, right?

                      Let's do the math on precisely what you're doing there:

                      1920 x 1200 x 4 bytes per pixel = 9,216,000 bytes for the raw framebuffer for one screen.

                      Approximately 9 MB.

                      Times three is about 27 MB. That's raw framebuffer.

                      Now, double buffering makes it roughly 55 MB...

                      Add in depth and stencil buffers, pixmaps for a composited desktop, and any 3D mode operations such as 2x FSAA (which multiplies the per-pixel sample storage) across a 5760x1200 surface, plus the textures of whatever you're running, and that 1 GB gets eaten a lot faster than you'd think.

                      "Nothing fancy" indeed.

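                      A quick sanity check of those numbers from a shell (assuming 4 bytes per pixel, ARGB):
                      Code:
                      echo $(( 1920 * 1200 * 4 ))         # 9216000 bytes, ~9 MB for one screen
                      echo $(( 1920 * 1200 * 4 * 3 ))     # ~27 MB for three screens
                      echo $(( 1920 * 1200 * 4 * 3 * 2 )) # ~55 MB with double buffering
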
                      Looks like I'll have to dump this adaptor after all. Seems 1 GB isn't enough.
                      You're going to find... issues... with anything with 1 GB doing what you're trying to do there. You'll need 1.5-2 GB of card memory to do it, even with NVIDIA.
