ATI's New Drivers: Did The Paradise Come?

#11
It's funny that although I've always used fglrx, I've never really noticed any video playback issues, certainly nothing like "I can't watch video". I tend to use mplayer, maybe that's why. And -vo gl.
I wonder if that memory-leak stuff is inherent to the driver; I should try to recreate it and find out. Since installing the newer drivers, some games and other accelerated GL programs have frozen my Ubuntu box. It might be either that or the heating issue.
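Here's roughly what I have in mind for reproducing it, as a minimal sketch (not from the article; it assumes Linux with /proc and glxgears on the PATH, and the 5-second interval and one-minute duration are arbitrary). A steadily climbing RSS would point to the leak:

Code:
#!/usr/bin/env python3
# Minimal sketch: launch glxgears and sample its resident memory (RSS)
# from /proc every few seconds. Assumes Linux and glxgears on the PATH.
import subprocess
import time

def rss_kb(pid):
    # /proc/<pid>/status has a line like: "VmRSS:    12345 kB"
    with open("/proc/%d/status" % pid) as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])
    return 0

proc = subprocess.Popen(["glxgears"], stdout=subprocess.DEVNULL)
try:
    for _ in range(12):  # watch for about a minute
        time.sleep(5)
        print("RSS: %d kB" % rss_kb(proc.pid))
finally:
    proc.terminate()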



#12
I've had nothing but bad experiences with ATI since the mid-90s, on both Windows and Linux -- mostly bad driver support (TV cards, Linux drivers) and poor display quality compared to Matrox and, to a lesser extent, nVidia. I've avoided their video cards ever since, whenever I could. I'm still wondering why people use ATI, especially with Linux.



#13
Because they seem to have a hammerlock on the laptop market. I've got a couple of laptops with their stuff in them: one that's supported with open-source drivers and one that's not. Sadly, the one that's supported with fglrx is actually slower than the one that's not, but I suspect that's more a function of the lack of vertex shader support in that chip than of the drivers themselves.

That, and many people are coming in from the Windows world: they're stuck with this stuff and don't have the budget to buy an NVidia card. Couple this with a desire to support ATI now that they're slowly opening up all the info in spades, and you've got the picture.

Me, I'm not waiting for the driver to get done. I want their tech specs. If they can staff up and get their act together, fine: their driver will be faster, because they've been up the food chain in 3D graphics longer than we have. However, until they get it stable, it's just not going to be an option for most people, period.



#14
Using a Mobility 9600 (M10).

With 8.42.3, I lost the render-to-texture extension (GL_render_texture, I think) that the previous drivers had. That, along with the logo artifacting at the bottom right, makes for the two most obvious and annoying issues...
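If anyone wants to compare driver versions, here's a rough way to see what's actually exposed. A minimal sketch, assuming glxinfo (from mesa-utils) is installed; since I'm not sure of the exact extension name, it just matches the substring render_texture:

Code:
#!/usr/bin/env python3
# Rough sketch: list every GL/GLX extension containing "render_texture".
# Assumes glxinfo (mesa-utils) is installed; run it once per driver
# version and diff the output.
import subprocess

out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
hits = sorted({tok.rstrip(",") for tok in out.split()
               if "render_texture" in tok.lower()})
print("\n".join(hits) if hits else "no render_texture extension found")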



#15
Thanks for the comment on the memory leak; I was simply disbelieving my eyes. I first noticed it in quake3: it leaks nearly one GB in ~45 minutes! I can also confirm the leak with glxgears; again, it's on the order of MB/s! Quite an impressive example of bad coding... I really hope this gets fixed in the next revision.



#16
I'm not seeing any memory leak, nothing as extreme as reported here at least.

I just ran glxgears for a couple of minutes and its memory consumption was constant.

Maybe it depends on the hardware? I'm using a 9550.



#17
I'm not seeing any mem leak here... I'm running an X800 Pro AGP, though.
glxgears mem usage remains a steady 1.2% and fgl_glxgears a constant 1.4%, both run for 15 min and observed with top... I've run UT2K4 under compiz-fusion for over an hour, no mem leaks there either... compiz-fusion had been up for over a week and I still had plenty of free RAM.

"Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety."
-- Ben Franklin, 1755



#18
Originally posted by the article:
The second problem is with AIGLX support. After this month's driver release, folks around the globe were dancing happily because the latest drivers were successful in initializing Compiz, Beryl, and friends. I'm an eye-candy lover, but I believe that it should be subtle and should be used to increase the productivity of the end user. In my opinion, KDE's internal compositing is sufficient and neat, so I enabled it; unfortunately, if I understand it correctly, while AIGLX is implemented and working, it lacks composition support, and hence the whole KDE desktop was rendered by my CPU at the expense of extreme CPU usage and slowness. As a result, I turned it off, and my eye-candy quest ended even before starting, again...
IIRC, aren't KDE3's "compositing effects" software-driven? I didn't think they were capable of using AIGLX or a true compositing WM properly (and therefore getting the video hardware to do the work).

Also, there are indeed incompatibilities between Compiz/Beryl and ATI's drivers. But you can't put the blame squarely on one or the other. C-F does some squirrelly things, and ATI shouldn't add squirrelly code to cover for those.

That all said, this has been a major code rewrite, and with it come a lot of bugs. Let's see if ATI cleans it up, because the rewrite shows major promise in benchmarks and is anything but a trainwreck so far... but it could end up there if they don't start fixing bugs.



#19
Hi all,

I'm Hakan, the writer of the article, and errm, so, here are some replies...

1. @DarkFoss && @chefkoch: It seems the memory leak is "generation"-dependent, since the driver design for such devices requires several code sections that are specific to particular hardware.

2. @igknighted: No. XComposite and KDE translucency are HW-acceleratable, and KDE uses that. nVidia cards accelerate it very well. I use the same OS (I cloned my home PC to the office) at the office with a 7600GT, and I use Pardus with KDE on a GeForce Go 7200 on my laptop. Both use the effects without any CPU usage. Ah, FYI, KDE has software compositing, and XComposite is SW-emulated if no accelerating HW is found. (A quick way to check what your own X server advertises is sketched below.)
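Here's the sketch I mean, assuming xdpyinfo is installed; note that it only shows the extensions are advertised, not whether your driver actually accelerates them:

Code:
#!/usr/bin/env python3
# Rough sketch: check whether the X server advertises the Composite,
# RENDER and DAMAGE extensions. Assumes xdpyinfo is installed; presence
# of an extension does not mean it is HW-accelerated.
import subprocess

out = subprocess.run(["xdpyinfo"], capture_output=True, text=True).stdout
tokens = set(out.split())
for ext in ("Composite", "RENDER", "DAMAGE"):
    print("%s: %s" % (ext, "present" if ext in tokens else "missing"))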

Edit: Added more info in "2" and fixed some typos.
Last edited by Silent Storm; 15 November 2007, 03:20 AM.



#20
Originally posted by Silent Storm View Post
@igknighted: No. XComposite and KDE translucency are HW-acceleratable, and KDE uses that. nVidia cards accelerate it very well. I use the same OS (I cloned my home PC to the office) at the office with a 7600GT, and I use Pardus with KDE on a GeForce Go 7200 on my laptop. Both use the effects without any CPU usage. Ah, FYI, KDE has software compositing, and XComposite is SW-emulated if no accelerating HW is found.
Ahh, thanks. I guess I'll have to give those effects another shot...
