Yes, mostly. I'm not sure whether RC6 is enabled by default yet, but if you follow the Intel news here you'll know.
From what I've read, it sounds like an Intel mobile gpu isn't bad.
I was just curious how Nvidia Optimus with Bumblebee (or equivalent) would compare. Maybe it's best to have an Nvidia discrete GPU alongside the Intel integrated one, and keep the Nvidia card disabled unless you actually need it? If power saving is the priority, that's what you'd do, right? That's Bumblebee's main job, as I understand it. I was wondering what the latest status is, how well it works, and what the general satisfaction level might be.
There actually seem to be quite a few Optimus laptops that are price-competitive with the cheaper Intel-mobile-GPU laptops of comparable hardware.
I'm not even considering an ATI mobile GPU laptop at all.
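For what it's worth, that on-demand model is exactly what Bumblebee implements: the discrete GPU stays powered down, and you wrap individual programs in optirun to run them on it. A quick sanity check (assuming Bumblebee and the mesa-utils tools are installed) looks like:

```
# Runs on the Intel IGP as usual:
glxinfo | grep "OpenGL renderer"

# Runs the same program on the Nvidia GPU via Bumblebee:
optirun glxinfo | grep "OpenGL renderer"
```

If the second command reports an NVIDIA renderer string while the first reports Intel, the switching is working.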
Even though I was told the card would arrive tomorrow, it got here today. Yay!
I built a kernel without DRI/DRM drivers, then installed the nvidia driver (295.17, using the Gentoo ebuild for 290.10) and configured an X "Device" section to load "nvidia". The desktop (KDE 4.8) works normally. Differences I noticed so far:
The very annoying lag you get with Catalyst when you enable VSync is gone completely. It seems NVidia has an implementation of desktop VSync that is actually usable. The "radeon" open source driver also didn't suffer from this problem, but it had another one: it would drop a frame or two every second. I thought it was a KDE problem, but the NVidia blob doesn't have this problem, so it seems the bug was in the X.Org stack?
Xv works perfectly, both in windowed as well in fullscreen mode. It is tear-free, and also the colors are correct and not washed out like with Catalyst.
VDPAU works perfectly in fullscreen mode. It does not work well in windowed mode; the whole KDE desktop, including the mouse cursor, starts stuttering when mplayer2 is not in fullscreen mode.
Adobe Flash (64-bit) uses hardware-accelerated video decoding now (needs "EnableLinuxHWVideoDecode=1" in /etc/adobe/mms.cfg). But it still uses software rendering, just like with Catalyst and "radeon".
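For anyone wanting to try this, the whole change is one line in Flash's global config file (create the file if it doesn't exist):

```
# /etc/adobe/mms.cfg
EnableLinuxHWVideoDecode=1
```

You may need to restart the browser for Flash to pick it up.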
GPU temps are way down: about 43C, compared to 70C with Catalyst and 80C with "radeon". That's mostly down to the new hardware, of course, not the driver (though "radeon" running about 10C hotter than Catalyst on the same card was a driver issue).
I can do my CUDA development natively now without needing gpuocelot. Gpuocelot is a very impressive and useful piece of software, but hey, native is better, right?
Now to the bad stuff:
I lost my 1920x1080 framebuffer console. It seems that NVidia only supports up to 1280x1024 for the framebuffer?
Power management through "Powermizer" is way too aggressive. The card clocks down to 50 MHz (!!) when it's idle for a bit, but it is not quick enough to clock the card back up again, resulting in jerky animations on the desktop. And sometimes it doesn't want to clock up again at all unless I run an OpenGL application. The way to fix that is by introducing custom Powermizer policies in xorg.conf, which is not very user-friendly and looks like an annoying bug to me.
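For reference, the workaround I mean goes in the "Device" section of xorg.conf. The RegistryDwords values below are the ones commonly passed around on the forums (0x2222 supposedly pins the performance level instead of letting it adapt), so treat them as an example rather than documented behavior:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Pin PowerMizer to a fixed performance level on both AC and battery
    # instead of letting it clock the card down aggressively.
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefault=0x1; PowerMizerDefaultAC=0x1"
EndSection
```

Restart X after changing it. It works, but hand-editing hex registry values is exactly the kind of thing users shouldn't have to do.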
As mentioned above, VDPAU has a problem in windowed mode. Anyone knows what's going on with that?
Flash says it uses the GPU for decoding, but not for the actual rendering. Is there a way to tell Flash to use Xv/OpenGL/whatever to display the video?
And the irrelevant stuff (because it's on Windows):
These are my impressions after about 3 hours of using the GTX 560 Ti.
I think I saw something on the nvidia forums about that VDPAU issue. If it's any consolation, I don't have that problem on compiz / gnome2 (driver 290.10 though), so maybe there is some kind of KDE setting?
And yeah, there is no nvidia framebuffer, so you have to use the vesa modes. The devs mentioned maybe coming up with fb support but decided it wasn't worth the effort.
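To at least get something better than the default text console, you can pick a VESA mode on the kernel command line. Mode numbers vary by card and BIOS, so 0x31B below is just the common 1280x1024 example:

```
# Appended to the kernel line in the bootloader (e.g. GRUB):
vga=0x31B    # 1280x1024 VESA framebuffer
# or have the kernel offer a menu of supported modes at boot:
vga=ask
```

Not a match for a native 1920x1080 console, but better than 80x25 text mode.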
If you add Option "Coolbits" "5" to the "Device" section in xorg.conf you might get some more PowerMizer options that could help (though I think you've found this already). I kind of have the opposite problem... my card is always at max performance, either because I have a composited desktop or because I'm driving two displays.
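In case it helps anyone else, that option just sits alongside the driver line (the identifier is whatever your config already uses):

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Exposes extra PowerMizer/clock controls in nvidia-settings
    Option     "Coolbits" "5"
EndSection
```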
I've been using ATI cards for almost 10 years (a Radeon 7500 was the first).
HAHAHA.. Do you remember when the VGA and the S-Video outputs got swapped in the driver for the Radeon 7000 series one day (around 2002)? That was funny. My VGA output wouldn't run anything other than 640x480, while the S-Video output was supposedly fully capable of 1280x1024 but would just hard-lock if you tried it. And on top of that, the mistake somehow ended up in the "stable" release of the driver and I picked it up in Debian Testing. Yeah, I've had bad experiences with the ATI drivers ever since the very beginning..
That's why I prefer to run proprietary drivers.. Much less likely something totally wonky will happen. I hate to bash the open source drivers (including Gallium), but the reality is they'll likely never reach the features or performance of the proprietary drivers. No MSAA and no stable Hyper-Z for r300g. AMD needs to put more people on their open source drivers and stop focusing so much on Catalyst.
I'd switch over to nvidia, but they're even WORSE.. No support for Optimus under Linux.. I'll just stick with AMD + Catalyst.