Has AMD Finally Fixed Tearing With Its Linux Driver?


  • #41
    Originally posted by Svartalf View Post
    1920 x 1200 x 4 x 8 = 73728000 bytes for the raw framebuffer for one screen.
    Maybe I missed something, but where does the factor of 8 come from? A 32-bit color buffer, i.e. RGBA8888, takes:
    1920x1200x4 = 9216000 bytes, so let's say 10 MB of memory.

    It seems you were giving a figure in Mbits; card memory is given in bytes, last time I heard.
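A quick check of the factor-of-8 discrepancy (a minimal sketch; the point is just the bit/byte distinction, the numbers are the ones from the thread):

```python
# One 1920x1200 screen at 32-bit color (RGBA8888, 4 bytes per pixel).
width, height, bytes_per_pixel = 1920, 1200, 4

framebuffer_bytes = width * height * bytes_per_pixel  # 9,216,000 bytes
framebuffer_bits = framebuffer_bytes * 8              # 73,728,000 bits

print(framebuffer_bytes)        # 9216000 -- roughly 10 MB, as stated
print(framebuffer_bits)         # 73728000 -- the earlier figure, but in bits
```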

    Comment


    • #42
      Originally posted by BlackStar View Post
      It's worse than that, since tearfree is (presumably) using triple-buffering, which means 730MB just for the color buffers. Add the necessary z-buffers (another 730MB) and the out-of-memory warning starts making sense, doesn't it?

      No wonder that the 6950/6970 cards come with 2GB memory.
      Heh... I knew I'd understated something somewhere... >:-D

      Simply put...if you're doing more than one display, you're doing something a bit more aggressive than you think.

      Comment


      • #43
        Originally posted by glisse View Post
        Maybe I missed something, but where does the factor of 8 come from? A 32-bit color buffer, i.e. RGBA8888, takes:
        1920x1200x4 = 9216000 bytes, so let's say 10 MB of memory.

        It seems you were giving a figure in Mbits; card memory is given in bytes, last time I heard.
        Hm... (Note to self: Don't try zinging off a response before going to a meeting...you make dumb mistakes like this.)

        You're right... But he's still got insufficient memory for attempting a triple-buffered configuration. I just got the calculations off. He's still probably going to need 1.5 GB at least. Let me mull over the best way to express what I'm trying to say here and I'll put it up here in a bit.

        Comment


        • #44
          Originally posted by BlackStar View Post
          Yeah, right, you wish. All my nvidia cards tear like hell as soon as I disable compiz/kwin or when I rotate my monitor.
          Originally posted by grigi View Post
           Agreed, on my notebook with the Nvidia binary drivers, it tears really badly. I also don't run compiz or similar (because it still gets slower over time, so Nvidia hasn't fixed their memory leak issue yet...)
           The only cool thing about Nvidia is how well vdpau is supported... Then again, VAAPI support is getting better by the day.
          It's always hilarious to hear people tell you how nvidia doesn't have issues - until someone who actually uses nvidia under Linux shows up.

          Comment


          • #45
            Originally posted by Joe Sixpack View Post
            It's always hilarious to hear people tell you how nvidia doesn't have issues - until someone who actually uses nvidia under Linux shows up.
            Nobody denies that nvidia has issues; they just have fewer issues than the alternatives and, for the most part, give a better experience than any of the others.

            Comment


            • #46
              Originally posted by Svartalf View Post
              Hm... (Note to self: Don't try zinging off a response before going to a meeting...you make dumb mistakes like this.)

              You're right... But he's still got insufficient memory for attempting a triple-buffered configuration. I just got the calculations off. He's still probably going to need 1.5 GB at least. Let me mull over the best way to express what I'm trying to say here and I'll put it up here in a bit.
              Insufficient memory....

              1920x1200 pixels = 2304000 pixels

              @ 4 bytes per pixel, that makes 9216000 bytes, yeah?

              divide by 1048576 for megabytes and I get the magic number of

              8.8 MBytes per frame buffer.

              For three screens that gives me 26.37 MBytes for frame store.
              Make that 79.10MBytes for a triple buffered configuration without rendering overhead.

              Now consider the "alternative" vendor which did provide a frame locked Compiz desktop with frame locked video playback.

              nVidia 9800GT 512M driving 2 x 1920x1200

              That's 512M driving 2/3 of the pixels the ATI's 1GB is driving.

              Going by nVidia's numbers, there's the possibility of being able to drive four of those screens with 1GB.

              Am I missing something in my numbers?
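The arithmetic above, spelled out step by step (assuming 4 bytes per pixel and binary megabytes, i.e. 1 MB = 1,048,576 bytes, as in the post):

```python
WIDTH, HEIGHT = 1920, 1200
BYTES_PER_PIXEL = 4          # 32-bit color (RGBA8888 / XRGB8888)
MIB = 1048576                # bytes per binary megabyte

pixels = WIDTH * HEIGHT                   # 2,304,000 pixels
frame_bytes = pixels * BYTES_PER_PIXEL    # 9,216,000 bytes
frame_mb = frame_bytes / MIB              # ~8.79 MB per framebuffer

screens = 3
single_buffered = screens * frame_mb      # ~26.37 MB for three screens
triple_buffered = 3 * single_buffered     # ~79.10 MB, excluding rendering overhead

print(round(frame_mb, 2))         # 8.79
print(round(single_buffered, 2))  # 26.37
print(round(triple_buffered, 2))  # 79.1
```

So the raw scanout buffers alone are tiny compared to 1GB of VRAM; whatever is exhausting memory, it isn't the framebuffers themselves.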

              Comment


              • #47
                Question; Is it actually necessary to store the alpha channel?

                Comment


                • #48
                  Originally posted by Silverthorn View Post
                  Question; Is it actually necessary to store the alpha channel?
                  I guess ultimately it'd depend on how they pack their pixels, but if you simply want to flip frames with the flick of a register, then my guess is they probably would store the alpha channel anyway.
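One plausible reason for keeping the fourth byte (an illustrative sketch, not a statement about any specific GPU): scanout hardware commonly prefers 4-byte-aligned pixels, so the "alpha" byte is often stored as padding (XRGB8888) whether or not it's used. The saving from a packed 24-bit format would be modest:

```python
WIDTH, HEIGHT = 1920, 1200

packed_rgb = WIDTH * HEIGHT * 3    # 24-bit RGB888, no alpha/padding byte
aligned_xrgb = WIDTH * HEIGHT * 4  # 32-bit XRGB8888, one unused byte per pixel

savings_mb = (aligned_xrgb - packed_rgb) / 1048576
print(round(savings_mb, 2))  # 2.2 -- MB saved per framebuffer by dropping it
```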

                  Comment


                  • #49
                    Maybe AMD wants to sell new cards with more VRAM. Usually VRAM is only split when you use a multi-GPU card, so I guess this is a programming error. Without official beta builds of the next driver, you just have to wait several weeks until you can try again. I really hate working that way, because the error you found should have been caught much earlier. From the comments, it seems the new tear-free mode has problems with power-save states (otherwise it would be illogical that running glxgears helps at all). I still won't use my AMD card on my main system until those bugs are resolved. My HD5670 would basically be faster than my GT220, but there are still too many drawbacks.

                    Comment


                    • #50
                      Getting back on topic: I finally got around to installing this new driver and enabling the "tear-free" thing.

                      I still get tearing on my desktop windows now that I've enabled the feature, but it seems to "fix" itself when I move the mouse cursor around. That part's new.

                      I hate my ATI HD5750. I only bought it because of the promise of adequate open source drivers in the near future, but I've hated it ever since I first installed it. I miss my old Nvidia card. It was outdated and slow, but my desktop always looked great. I wish I could afford to buy a new video card and get rid of this trash.

                      Comment
