Goodbye ATI

  • #81
    Originally posted by curaga View Post
    Yes, mostly. I'm not sure whether RC6 defaults to on yet, but if you follow the Intel news here you'll know.
    From what I've read, it sounds like an Intel mobile GPU isn't bad.

    I was just curious how Nvidia Optimus with Bumblebee (or equivalent) would compare. Maybe it's best to have an Nvidia discrete GPU alongside the Intel integrated one, with the Nvidia chip disabled unless you actually need it? If power saving is the priority, that seems like the way to go, and that's basically what Bumblebee is for. I was wondering how well it works these days and what the general satisfaction level is.
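
    The way I understand the Bumblebee setup, the discrete GPU stays powered down and you only push individual programs onto it with optirun, roughly like this (assuming Bumblebee and its daemon are already installed and running; glxgears is just a stand-in for whatever you'd actually run):

        optirun glxgears    # this one program renders on the Nvidia GPU
        glxgears            # everything else stays on the Intel GPU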

    There actually seem to be quite a few Optimus laptops that are price-competitive with the cheaper Intel-mobile-GPU laptops of comparable hardware.

    I'm not even considering an ATI mobile GPU laptop at all.



    • #82
      Even though I was told the card would arrive tomorrow, it got here today. Yey!

      I built a kernel without DRI/DRM drivers, then installed the nvidia driver (295.17, using the Gentoo ebuild for 290.10) and configured an X "Device" section to load "nvidia". The desktop (KDE 4.8) works normally. Differences I noticed so far:
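
      For reference, the relevant part of xorg.conf is just a bare "Device" section along these lines (the Identifier string is arbitrary, and BusID is only needed if X doesn't find the card on its own):

          Section "Device"
              Identifier "GTX560Ti"
              Driver     "nvidia"
              # BusID    "PCI:1:0:0"   # uncomment only if autodetection fails
          EndSection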

      • The very annoying lag you get with Catalyst when you enable VSync is gone completely. It seems NVidia has an implementation of desktop VSync that is actually usable. The "radeon" open source driver also didn't suffer from this, but it had another issue: it would drop a frame or two every second. I thought that was a KDE problem, but the NVidia blob doesn't show it either, so maybe the bug was in the X.Org stack?

      • Xv works perfectly, both in windowed and in fullscreen mode. It is tear-free, and the colors are correct, not washed out like with Catalyst.

      • VDPAU works perfectly in fullscreen mode. It does not work well in windowed mode; the whole KDE desktop, including the mouse cursor, starts stuttering when mplayer2 is not in fullscreen mode.

      • Adobe Flash (11.1.102.55 64-bit) uses hardware-accelerated video decoding now (needs "EnableLinuxHWVideoDecode=1" in /etc/adobe/mms.cfg; see the snippet after this list). But it still uses software rendering, just like with Catalyst and "radeon".

      • GPU temps are way down: about 43°C, compared to 70°C with Catalyst and 80°C with "radeon". That's due to the hardware, of course, not the driver (except that "radeon" wasn't good at this and ran about 10°C hotter than Catalyst).

      • I can do my CUDA development natively now without needing gpuocelot. Gpuocelot is a very impressive and useful piece of software, but hey, native is better, right?
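
      The Flash tweak mentioned above is literally one line; if /etc/adobe/mms.cfg doesn't exist yet, just create it with this as its only content:

          EnableLinuxHWVideoDecode=1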


      Now to the bad stuff:

      • I lost my 1920x1080 framebuffer console. It seems that NVidia only supports up to 1280x1024 for the framebuffer?

      • Power management through "Powermizer" is way too aggressive. The card clocks down to 50 MHz (!!) when it's idle for a bit, but it isn't quick enough to clock back up again, resulting in jerky animations on the desktop. And sometimes it doesn't want to power up again at all unless I run an OpenGL application. The way to fix that is by introducing custom Powermizer policies in xorg.conf (see the sketch after this list), which is not very user friendly and looks like an annoying bug to me.

      • As mentioned above, VDPAU has a problem in windowed mode. Anyone know what's going on with that?

      • Flash says it uses the GPU for decoding, but not for the actual rendering. Is there a way to tell Flash to use Xv/OpenGL/whatever to display the video?
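
      For anyone curious about the Powermizer workaround mentioned above: it goes through the driver's "RegistryDwords" option in the "Device" section. Something like the following is what gets passed around on the forums; I haven't verified every key, so treat the exact values as an illustration rather than gospel:

          Section "Device"
              Identifier "GTX560Ti"
              Driver     "nvidia"
              # Commonly posted workaround: pin the performance level source so the
              # card stops dropping to its lowest clocks. Exact values may need
              # tweaking per card and driver version.
              Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefault=0x1; PowerMizerDefaultAC=0x1"
          EndSection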


      And the irrelevant stuff (because it's on Windows):

      • PhysX yey!!


      These are my impressions after about 3 hours of using the GTX 560 Ti.
      Last edited by RealNC; 10 February 2012, 11:53 AM.



      • #83
        I think I saw something on the nvidia forums about that VDPAU issue. If it's any consolation, I don't have that problem on compiz / gnome2 (driver 290.10 though), so maybe there is some kind of KDE setting?

        And yeah, there is no nvidia framebuffer, so you have to use the vesa modes. The devs mentioned maybe coming up with fb support but decided it wasn't worth the effort.
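
        If you at least want a higher-resolution console back, the usual trick is to ask for a VESA framebuffer mode on the kernel command line. With GRUB legacy that's just a vga= parameter appended to the kernel line (the kernel image and root device below are placeholders; 795 should be the VESA number for 1280x1024 at 24bpp, if I remember the mode table right):

            # /boot/grub/menu.lst - add vga= to the existing kernel line
            kernel /boot/vmlinuz root=/dev/sda1 ro quiet vga=795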

        If you add Option "Coolbits" "5" to the "Device" section in xorg.conf (see the snippet below), you get some extra PowerMizer options that might help (though I think you've found this already). I kinda have the opposite problem... my card is always at max performance, either because I have a composited desktop or because I'm driving two displays.
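
        For reference, the whole thing is just this (the Identifier is whatever your existing "Device" section already uses):

            Section "Device"
                Identifier "Card0"
                Driver     "nvidia"
                Option     "Coolbits" "5"   # value 5 as suggested above; exposes extra PowerMizer/clock controls
            EndSection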
        Last edited by johnc; 10 February 2012, 12:24 PM.



        • #84
          Originally posted by johnc View Post
          I kinda have the opposite problem... my card is always at max performance, either because I have a composited desktop or because I'm driving two displays.
          Well, you can find out for sure by disabling compositing and removing the second display.



          • #85
            Two monitors of course need more power - even more so if they run at different resolutions.



            • #86
              Originally posted by RealNC View Post
              I've been using ATI cards for almost 10 years (a Radeon 7500 was the first).
              HAHAHA... Do you remember when the VGA and S-Video outputs got swapped in the driver for the Radeon 7000 series one day (around 2002)? That was funny. My VGA output wouldn't run anything other than 640x480, while the S-Video was supposedly fully capable of 1280x1024 but would just hard-lock if you tried it... and on top of that, the mistake somehow ended up in the "stable" release of the driver and I picked it up in Debian Testing. Yeah, I've had bad experiences with the ATI drivers ever since the very beginning.

              That's why I prefer to run proprietary drivers: it's much less likely that something totally wonky will happen. I hate to bash the open source drivers (including Gallium), but the reality is they will likely never reach the features or performance of the proprietary drivers - no MSAA and no stable Hyper-Z for r300g. AMD needs to put more people on their open source drivers and stop focusing so much on Catalyst.

              I'd switch over to nvidia, but they're even WORSE.. No support for Optimus under Linux.. I'll just stick with AMD + Catalyst.
              Last edited by Sidicas; 10 February 2012, 03:36 PM.



              • #87
                Originally posted by RealNC View Post

                And the irrelevant stuff (because it's on Windows):
                • PhysX yey!!

                It works on Linux too. Wine handles it well - and don't forget that you can dump Windows entirely now.



                • #88
                  Originally posted by Sidicas View Post
                  I'd switch over to nvidia, but they're even WORSE.. No support for Optimus under Linux.. I'll just stick with AMD + Catalyst.
                  I don't even know what Optimus is. The only thing I know is that it's something to do with laptops, which I'm not interested in, so I haven't even bothered to look up the details :-P



                  • #89
                    Originally posted by Sidicas View Post
                    I'd switch over to nvidia, but they're even WORSE.. No support for Optimus under Linux.. I'll just stick with AMD + Catalyst.
                    What tech does AMD have that does what Optimus should do?



                    • #90
                      Originally posted by bug77 View Post
                      That's one of the biggest problems with Linux fanboys. You can't have 2% market share on desktops and act like the rest of the world doesn't exist because it's closed source.
                      And any manufacturer should bend over backwards to support stuff people use.
                      Linux fanboys? I'm a GNU/Linux user and a free software advocate. Microsoft isn't even in competition with GNU, since the software they produce is closed and therefore unethical.

                      So 'the rest of the world' is of no consequence to me, and it shouldn't be to anyone who understands the ethical development of software. That is not fanboyism.

