Has AMD Finally Fixed Tearing With Its Linux Driver?


  • #11
    aticonfig --set-pcs-u32=DDX,EnableTearFreeDesktop,1

    Does this set something in the BIOS or in the driver? The reason I ask is that I use multiple video cards in a multiseat arrangement. If it's in the driver, perhaps I would only need to do this once. If it sets the BIOS, I might have to update each card individually.

    • #12
      Originally posted by jamey0824 View Post
      aticonfig --set-pcs-u32=DDX,EnableTearFreeDesktop,1

      Does this set something in the BIOS or in the driver? The reason I ask is that I use multiple video cards in a multiseat arrangement. If it's in the driver, perhaps I would only need to do this once. If it sets the BIOS, I might have to update each card individually.
      The ATI "PCS" database is a key-value database that stores elementary types such as strings and integers. Why they opted to invent a proprietary database, I have no idea.

      Anyway, PCS settings apply to all cards/GPUs within the scope of a given instance of fglrx. So as long as all your cards are plugged into the same computer, running under the same (CPU) kernel, and powered by the same fglrx kernel module, this setting will apply to them all. I don't think it is even possible to disable tear-free for some cards while enabling it for others.
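
      In practice that means something like this should be a one-time job per machine (the set command is jamey0824's from above; whether --get-pcs-key is the right way to read a value back is my assumption, so check aticonfig --help):

        # Set once; applies to every card driven by this fglrx instance
        aticonfig --set-pcs-u32=DDX,EnableTearFreeDesktop,1
        # Read it back to verify (assumed syntax, check aticonfig --help)
        aticonfig --get-pcs-key=DDX,EnableTearFreeDesktop
        # Then restart X so the DDX picks up the new value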

      • #13
        Originally posted by monraaf View Post
        No, tear-free video playback is something you'd expect from a graphics driver. It's like windshield wipers on a car. You don't go boasting about how fantastic this 'new thing' called windshield wipers is on your ATI car. Other people will laugh at you, because it's something they have always had on their cars and just take for granted.

        Furthermore, Xv still has the wrong color space, so you get washed-out videos, and GL has high CPU usage issues when doing hardware colorspace conversion with some videos.

        So sorry to be the party pooper here, but the truth is fglrx is still a hassle and a pain in the ass for video playback.
        Car analogies are pretty irrelevant to the software world. Established standards about features and so on are only really applicable when you are dealing with computers that are designed and manufactured as appliances; i.e., Windows and Mac OS X on mass-produced Dell and Apple boxen running a stock configuration. If those didn't have tear-free video, then yes, it'd be analogous to going to the Ford dealership down the road and buying a Ford without windshield wipers.

        But you have to realize that it is extremely easy to wander outside the bounds of appliance-style normalcy in software. Just booting up the Linux kernel at all on anything short of a carrier-subsidized Android phone is the equivalent of building your own car from parts. So don't be surprised if you buy a hood unit that doesn't come with windshield wiper attachments.

        Setting our standards high and demanding rich features from the likes of AMD is going to be beneficial to us in the long run, but that doesn't mean you should get a sour attitude and say "they should've had this years ago!" when it finally arrives. It's here; it's an improvement; enjoy it.

        And as to your issues with Xv and CPU usage... I can't reproduce any of those problems here. HD5970 and a Core i7. I normally play video using Adobe Flash 10.2 "Square" or VLC's OpenGL plugin, and both of these have excellent video quality and the CPU usage is not a factor since I generally don't multitask while watching videos. I can even watch Netflix inside a VMware Windows VM without audio or video stuttering, and still tear-free.

        IMHO unless you are running a tiny underpowered laptop, there is no real reason to need hardware-accelerated video. Recent desktop CPUs are more than powerful enough to do audio software mixing, audio/video decoding and rendering at the same time. The computational complexity of video decoding hasn't ballooned nearly as quickly or dramatically as the demands of games, so it's pretty easy for me to simply not care whether it's done on the GPU or the CPU.

        • #14
          Originally posted by allquixotic View Post
          The ATI "PCS" database is a key-value database that stores elementary types such as strings and integers. Why they opted to invent a proprietary database, I have no idea.

          Anyway, PCS settings apply to all cards/GPUs within the scope of a given instance of fglrx. So as long as all your cards are plugged into the same computer, running under the same (CPU) kernel, and powered by the same fglrx kernel module, this setting will apply to them all. I don't think it is even possible to disable tear-free for some cards while enabling it for others.
          Wow, that actually explains a few problems I am having as well. I have 4 seats on my multiseat computer, but on 2 of the 4 seats I have two screens, and to enable big desktop you need to set the PCS key with --set-pcs-str="DDX,EnableRandR12,FALSE". When you do that, it obviously jacks up the settings for the other seats. I really wish AMD didn't assume that every card should have the same settings. Perhaps, at the very minimum, have a separate database for each card, or something.
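
          To make the clash concrete, here is a sketch of what happens (the command is the one from this post; the lack of any per-card selector is per allquixotic's explanation above):

            # Needed for big desktop on the dual-screen seats...
            aticonfig --set-pcs-str=DDX,EnableRandR12,FALSE
            # ...but PCS is global, so the single-screen seats lose RandR 1.2
            # too. There is no per-card variant of this that I know of.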

          • #15
            Originally posted by allquixotic View Post
            Car analogies are pretty irrelevant to the software world. Established standards about features and so on are only really applicable when you are dealing with computers that are designed and manufactured as appliances; i.e., Windows and Mac OS X on mass-produced Dell and Apple boxen running a stock configuration. If those didn't have tear-free video, then yes, it'd be analogous to going to the Ford dealership down the road and buying a Ford without windshield wipers.
            Bollocks. The fact is that Nvidia has managed to get tear-free playback with their Linux driver for God knows how long, and even the open-source ATI driver has had it for quite some time now. It was just fglrx that didn't have it. So please spare me the nonsense about non-stock configurations and whatever other excuses.

            Setting our standards high and demanding rich features from the likes of AMD is going to be beneficial to us in the long run, but that doesn't mean you should get a sour attitude and say "they should've had this years ago!" when it finally arrives. It's here; it's an improvement; enjoy it.
            A man walks into a restaurant and orders a bowl of soup. When the waiter delivers the soup to his table, the man notices that there are at least a few dozen flies swimming in it. When the man complains to the waiter, the waiter fishes out one fly with a spoon and walks away. Now the man is not satisfied, of course, and wants to call the waiter again. Just as the man tries to raise his voice, allquixotic, sitting at the table next to him, interrupts:

            "Setting your standards high is a good thing, but there's no need to get such a sour attitude. Be happy, the waiter just fished one fly out of your soup, now Bon App?tit !"

            I never quite understood what drives people to become water carriers for ATI (or insert any other company here).

            And as to your issues with Xv and CPU usage... I can't reproduce any of those problems here.
            Fglrx uses a formula to convert YUV to RGB with a target RGB color range of 16-235 instead of the 0-255 that computer monitors use. Saying you can't reproduce it is like saying you can't reproduce 1+1=2. Maybe you can't see the difference, just like some people could never see tearing. But that's a different story.
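
            For reference, the arithmetic behind the complaint (standard BT.601 limited-range numbers; that fglrx skips exactly this expansion step is the claim here, not something I've verified myself):

              full  = (limited - 16) * 255 / 219
              black: (16 - 16)  * 255 / 219 = 0    (left at 16, blacks look dark gray)
              white: (235 - 16) * 255 / 219 = 255  (left at 235, whites look dim)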

            • #16
              Originally posted by energyman View Post
              good thing I don't even realize that it is a pain in the ass. It works well for me.
              ignorance is bliss, as they say... lucky bastard.

              • #17
                Originally posted by monraaf View Post
                Bollocks. The fact is that Nvidia has managed to get tear-free playback with their Linux driver for God knows how long, and even the open-source ATI driver has had it for quite some time now. It was just fglrx that didn't have it. So please spare me the nonsense about non-stock configurations and whatever other excuses.
                Yeah, right, you wish. All my Nvidia cards tear like hell as soon as I disable compiz/kwin or when I rotate my monitor.

                • #18
                  Originally posted by monraaf View Post
                  A man walks into a restaurant and orders a bowl of soup. When the waiter delivers the soup to his table, the man notices that there are at least a few dozen flies swimming in it. When the man complains to the waiter, the waiter fishes out one fly with a spoon and walks away. Now the man is not satisfied, of course, and wants to call the waiter again. Just as the man tries to raise his voice, allquixotic, sitting at the table next to him, interrupts:

                  "Setting your standards high is a good thing, but there's no need to get such a sour attitude. Be happy, the waiter just fished one fly out of your soup, now Bon App?tit !"
                  Aaaaaaaahahahaha

                  • #19
                    Originally posted by BlackStar View Post
                    Yeah, right, you wish. All my Nvidia cards tear like hell as soon as I disable compiz/kwin or when I rotate my monitor.
                    Agreed; on my notebook with the Nvidia binary drivers it tears really badly. I also don't run compiz or similar (because it still gets slower over time, so Nvidia hasn't fixed their memory leak issue yet...).
                    The only cool thing about Nvidia is how well VDPAU is supported... Then again, VA-API support is getting better by the day.
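
                    For what it's worth, VDPAU playback is usually just a flag away (mplayer options as I remember them from that era; the codec list may vary by build):

                      mplayer -vo vdpau -vc ffh264vdpau,ffmpeg12vdpau video.mkv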

                    • #20
                      Originally posted by agd5f View Post
                      Wait for vline. You can stall the GPU until scanout has passed a certain line in the scanout image. The downside is that the GPU has to wait until the vline passes, so you're not rendering during that period. With multi-buffering, the GPU is always busy, but it takes more memory, since you need a stack of buffers to cycle between to avoid rendering to the buffer being scanned out.
                      Is there already a way to disable "wait for vline" for fullscreen apps without making modifications to the radeon X driver or the kernel DRM module? What would I need (which kernel, driver, etc.), and how would I do that?
                      Right now I use kernel 2.6.36 and the r300g driver + Mesa 7.11-git on X server 1.9.0, and vblank_mode=0 only makes a difference for glxgears, not for fullscreen FPS games.
