
Has AMD Finally Fixed Tearing With Its Linux Driver?


  • #41
    Originally posted by glisse View Post
    Maybe I missed something, but where does the factor of 8 come from? A 32-bit color buffer, i.e. RGBA8888, takes:
    1920x1200x4 = 9216000 bytes, so let's say 10 Mbytes of memory

    It seems you were giving Mbit numbers; card memory is given in bytes, last time I heard
    Hm... (Note to self: Don't try zinging off a response before going to a meeting...you make dumb mistakes like this.)

    You're right... But he's still got insufficient memory for attempting a triple-buffered configuration. I just got the calculations off. He's still probably going to need at least 1.5 GB. Let me mull over the best way to express what I'm trying to say here and I'll put it up here in a bit.

    Comment


    • #42
      Originally posted by BlackStar View Post
      Yeah, right, you wish. All my nvidia cards tear like hell as soon as I disable compiz/kwin or when I rotate my monitor.
      Originally posted by grigi View Post
      Agreed, on my notebook with Nvidia binary drivers, it tears really bad. I also don't run compiz or similar (because it still gets slower over time, so Nvidia hasn't fixed their memory leak issue yet...)
      The only cool thing of Nvidia is how well vdpau is supported... Then again, VAAPI support is becoming better by the day.
      It's always hilarious to hear people tell you how nvidia doesn't have issues - until someone who actually uses nvidia under Linux shows up.

      Comment


      • #43
        Originally posted by Joe Sixpack View Post
        It's always hilarious to hear people tell you how nvidia doesn't have issues - until someone who actually uses nvidia under Linux shows up.
        Nobody denies that nvidia has issues; they just have fewer issues than the alternatives and for the most part give a better experience than any of the others.

        Comment


        • #44
          Originally posted by Svartalf View Post
          Hm... (Note to self: Don't try zinging off a response before going to a meeting...you make dumb mistakes like this.)

          You're right... But he's still got insufficient memory for attempting a triple-buffered configuration. I just got the calculations off. He's still probably going to need at least 1.5 GB. Let me mull over the best way to express what I'm trying to say here and I'll put it up here in a bit.
          Insufficient memory....

          1920x1200 pixels = 2304000 pixels

          @ 4 bytes per pixel make that 9216000 bytes yeh?

          divide by 1048576 for megabytes and I get the magic number of

          8.8 MBytes per frame buffer.

          For three screens that gives me 26.37 MBytes for frame store.
          Make that 79.10MBytes for a triple buffered configuration without rendering overhead.

          Now consider the "alternative" vendor which did provide a frame locked Compiz desktop with frame locked video playback.

          nVidia 9800GT 512M driving 2 x 1920x1200

          That's 512M driving 2/3 of the pixels the ATI's 1GB is driving.

          Going by nVidia's numbers there's the possibility of being able to drive four of those screens with 1GB.

          Am I missing something in my numbers?
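
          The arithmetic above can be sketched as a quick check (the function name is my own, just for illustration):

          ```python
          # Re-deriving the frame buffer numbers from this post:
          # 1920x1200 at 4 bytes/pixel, three screens, triple-buffered.

          def framebuffer_mib(width, height, bytes_per_pixel=4):
              """Memory for a single frame buffer, in MiB."""
              return width * height * bytes_per_pixel / 1048576

          per_frame = framebuffer_mib(1920, 1200)   # ~8.79 MiB per frame buffer
          three_screens = 3 * per_frame             # ~26.37 MiB for three screens
          triple_buffered = 3 * three_screens       # ~79.10 MiB triple-buffered

          print(f"{per_frame:.2f} / {three_screens:.2f} / {triple_buffered:.2f} MiB")
          # prints "8.79 / 26.37 / 79.10 MiB"
          ```

          Even the triple-buffered total is a tiny fraction of a 512 MB card, which is the point being made.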

          Comment


          • #45
            Question; Is it actually necessary to store the alpha channel?

            Comment


            • #46
              Originally posted by Silverthorn View Post
              Question; Is it actually necessary to store the alpha channel?
              I guess ultimately it'd depend on how they pack their pixels but if you simply want to change frames with the flick of a register then my guess is they probably would store the alpha channel.
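
              To put rough numbers on the packing question (a sketch only; the format names and byte counts are common conventions, not anything confirmed about this driver):

              ```python
              # Per-frame memory at 1920x1200 for a few common pixel packings.
              # Hardware often pads RGB to 4 bytes (XRGB8888) for aligned access
              # even when the alpha byte is unused.

              FORMATS = {
                  "RGBA8888": 4,  # alpha channel stored
                  "XRGB8888": 4,  # alpha byte present but ignored
                  "RGB888":   3,  # tightly packed, no alpha
              }

              def frame_bytes(width, height, bytes_per_pixel):
                  """Raw size of one frame buffer in bytes."""
                  return width * height * bytes_per_pixel

              for name, bpp in FORMATS.items():
                  mib = frame_bytes(1920, 1200, bpp) / 1048576
                  print(f"{name}: {mib:.2f} MiB")
              ```

              Dropping the alpha byte only saves about 2.2 MiB per frame at this resolution, which is one reason drivers tend to keep the 4-byte packing and the simple register flip it allows.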

              Comment


              • #47
                Maybe AMD wants to sell new cards with more VRAM. Usually VRAM is only split when you use a multi-GPU card, so I guess that's a programming error. Without official beta builds of the next driver you just have to wait several weeks before you can try again. I really hate that approach, because the error you found should have been caught much earlier. But from the comments it seems that the new tear-free mode has problems with power-save states (otherwise it would be illogical that glxgears could help at all). I still won't use my AMD card on my main system until those bugs are resolved. Basically my HD 5670 would be faster than my GT 220, but there are still too many drawbacks.

                Comment


                • #48
                  Getting back on topic: I finally got around to installing this new driver and enabling the "tear-free" thing.

                  I still get tearing on my desktop windows now that I've enabled the feature, but it seems to "fix" itself when I move the mouse cursor around. That part's new.

                  I hate my ATI HD5750. I only bought it because of the promise of adequate open source drivers in the near future, but I've hated it ever since I first installed it. I miss my old Nvidia card. It was outdated and slow, but my desktop always looked great. I wish I could afford to buy a new video card and get rid of this trash.

                  Comment


                  • #49
                    fuck blobs

                    Originally posted by GoremanX View Post
                    I hate my ATI HD5750. I only bought it because of the promise of adequate open source drivers in the near future, but I've hated it ever since I first installed it. I miss my old Nvidia card. It was outdated and slow, but my desktop always looked great. I wish I could afford to buy a new video card and get rid of this trash.
                    heh, isn't the entire 5xxx series supposed to be supported by the open graphics stack already?
                    like, install kernel 2.6.37 (or .36 if you can't), plus libdrm, xf86-video-ati, and mesa from git, and you're golden (have I mentioned it's tear-free?). or maybe libdrm-2.4.23, xf86-video-ati-6.13.2, and mesa-7.10 will suffice.
                    i bought my 4730 when r600g was only starting to emerge in private branches or something (and the 5xxx cards were only just hitting local shelves). thankfully r600c was solid and 2D in the ddx was OK. maybe playing games is a pain, but the desktop is uncrashable. you'd have to be a masochist to still use the "official" driver.

                    and now, i think i'll reboot my main box into the alternative OS and play some recent game at my native 1920x1080 resolution on my cheap-ass $100 card. and to hell with the slow old nv 7800 GTX!

                    Comment


                    • #50
                      Pretty short-sighted of you. You want to support a company that does not support open source drivers. Smart move. Not.

                      Comment
