Yes, mostly. I'm not sure whether RC6 is enabled by default yet, but if you follow the Intel news here you'll know.
From what I've read, it sounds like an Intel mobile gpu isn't bad.
I was just curious how Nvidia Optimus with Bumblebee (or an equivalent) would compare. Maybe it's best to have an Nvidia discrete GPU alongside the Intel integrated one, with the Nvidia card disabled unless you need it? I mean, if power saving is the priority, that's what you'd do, and that's primarily what Bumblebee is for, right? I was wondering how well it works these days and what the general satisfaction level is.
There actually seem to be quite a few Optimus laptops that are price-competitive with the cheaper Intel-only laptops of comparable hardware.
I'm not even considering an ATI mobile GPU laptop at all.
Even though I was told the card would arrive tomorrow, it got here today. Yay!
I built a kernel without DRI/DRM drivers, then installed the nvidia driver (295.17, using the Gentoo ebuild for 290.10) and configured an X "Device" section to load "nvidia". The desktop (KDE 4.8) works normally. Differences I noticed so far:
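For reference, the relevant xorg.conf snippet is tiny; something like this (the Identifier string is just an illustrative name, not what I actually used):

```
Section "Device"
    Identifier "nvidia-card"   # illustrative name; anything unique works
    Driver     "nvidia"        # load the proprietary blob instead of nouveau/vesa
EndSection
```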
The very annoying lag you get with Catalyst when you enable VSync is gone completely. It seems NVidia has an implementation of desktop VSync that is actually usable. The "radeon" open source driver also didn't suffer from this problem, but it had another one: it would drop a frame or two every second. I thought it was a KDE problem, but the NVidia blob doesn't have this problem, so it seems the bug was in the X.Org stack?
Xv works perfectly, both in windowed and in fullscreen mode. It is tear-free, and the colors are correct, not washed out like with Catalyst.
VDPAU works perfectly in fullscreen mode. It does not work well in windowed mode; the whole KDE desktop, including the mouse cursor, starts stuttering when mplayer2 is not in fullscreen mode.
Adobe Flash (64-bit) uses hardware accelerated video decoding now (needs "EnableLinuxHWVideoDecode=1" in /etc/adobe/mms.cfg). But it still uses software rendering, just like with Catalyst and "radeon".
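For anyone who wants to try this, the whole of /etc/adobe/mms.cfg can be as small as the following (the OverrideGPUValidation line is an extra knob that reportedly helps when Flash refuses to use a card it doesn't whitelist; treat it as optional):

```
# /etc/adobe/mms.cfg
EnableLinuxHWVideoDecode=1
# OverrideGPUValidation=true   # optional; reportedly needed on some setups
```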
GPU temps are way down: about 43C, compared to 70C with Catalyst and 80C with "radeon". That's down to the hardware, of course, not the driver (except in the case of "radeon", which ran about 10C hotter than Catalyst).
I can do my CUDA development natively now, without needing gpuocelot. Gpuocelot is a very impressive and useful piece of software, but hey, native is better, right?
Now to the bad stuff:
I lost my 1920x1080 framebuffer console. It seems that NVidia only supports up to 1280x1024 for the framebuffer?
Power management through "PowerMizer" is way too aggressive. The card clocks down to 50MHz (!!) when it's idle for a bit, but isn't quick enough to clock back up again, resulting in jerky desktop animations. And sometimes it doesn't want to power up again at all unless I run an OpenGL application. The fix is to introduce custom PowerMizer policies in xorg.conf, which is not very user friendly and looks like an annoying bug to me.
As mentioned above, VDPAU has a problem in windowed mode. Anyone know what's going on with that?
Flash says it uses the GPU for decoding, but not for the actual rendering. Is there a way to tell Flash to use Xv/OpenGL/whatever to display the video?
And the irrelevant stuff (because it's on Windows):
These are my impressions after about 3 hours of using the GTX 560 Ti.
I think I saw something on the nvidia forums about that VDPAU issue. If it's any consolation, I don't have that problem on compiz / gnome2 (driver 290.10 though), so maybe there is some kind of KDE setting?
And yeah, there is no nvidia framebuffer, so you have to use the vesa modes. The devs mentioned maybe coming up with fb support but decided it wasn't worth the effort.
If you add Option "Coolbits" "5" to the "Device" section in the xorg.conf you might get some more PowerMizer options that might help (though I think you've found this already). I kinda have the opposite problem... my card is always at max performance, either because I have a composited desktop or because I'm driving two displays.
I've been using ATI cards for almost 10 years (a Radeon 7500 was the first).
HAHAHA.. Do you remember when the VGA and S-Video outputs got swapped in the driver for the Radeon 7000 series one day (around 2002)? That was funny. My VGA output wouldn't run anything other than 640x480, while the S-Video was supposedly fully capable of 1280x1024 but would just hard-lock if you tried it. And on top of that, the mistake somehow ended up in the "stable" release of the driver, and I picked it up in Debian Testing. Yeah, I've had bad experiences with the ATI drivers ever since the very beginning.
That's why I prefer to run proprietary drivers: much less likely that something totally wonky will happen. I hate to bash the open source drivers (including Gallium), but the reality is they will likely never match the features or performance of the proprietary ones. No MSAA and no stable Hyper-Z for r300g. AMD needs to put more people on their open source drivers and stop focusing so much on Catalyst.
I'd switch over to nvidia, but they're even WORSE: no support for Optimus under Linux. I'll just stick with AMD + Catalyst.
That's one of the biggest problems with Linux fanboys. You can't have 2% market share on desktops and act like the rest of the world doesn't exist because it's closed source, and then expect every manufacturer to bend over backwards to support the stuff you use.
Linux fanboys? I'm a GNU/Linux user and a free software advocate. Microsoft isn't even in competition with GNU, as the software they produce is closed and therefore unethical, so 'the rest of the world' is of no consequence to me, and it shouldn't be to anyone who understands the ethical development of software. This is not fanboyism.
fglrx works fine for me with my Radeon HD 2400 PRO AGP, even with XvBA, as does my Radeon HD 4650, even with 'high end games'.
I've had many nvidia cards, used the nvidia blob with all of them, and they worked pretty well too.
The problem (as far as I can see) is the difference between AMD's idea of what OpenGL is and nvidia's idea of what OpenGL is. I'm assuming this is why nvidia cards work better with Wine, as most of the wined3d development has been done on nvidia.
Anyway, I'm switching to the free AMD driver very soon, as VDPAU is close to working properly.
[*]I lost my 1920x1080 framebuffer console. It seems that NVidia only supports up to 1280x1024 for the framebuffer?
Check this out. It may still be possible to set a 1920x1080 framebuffer, but performance might be very low (something with nvidia not supporting high-res vesa modes natively or whatever).
[*]Power management through "PowerMizer" is way too aggressive. The card clocks down to 50MHz (!!) when it's idle for a bit, but isn't quick enough to clock back up again, resulting in jerky desktop animations. And sometimes it doesn't want to power up again at all unless I run an OpenGL application. The fix is to introduce custom PowerMizer policies in xorg.conf, which is not very user friendly and looks like an annoying bug to me.
That's by design and, sadly, there's no solution that doesn't increase power consumption. Three workarounds: connect a second monitor (the lowest performance level gets disabled), flash different clocks to the video BIOS, or use a custom PowerMizer policy.
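A custom policy usually means the RegistryDwords option in the "Device" section of xorg.conf. The values below are the ones floating around the forums (PerfLevelSrc=0x2222 is said to pin a fixed performance level and PowerMizerDefaultAC=0x1 to pick the highest on AC power), so verify the effect in nvidia-settings rather than taking them as gospel:

```
Section "Device"
    Identifier "Card0"      # must match your existing Device section
    Driver     "nvidia"
    # Pin PowerMizer to the highest performance level on AC power
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefaultAC=0x1"
EndSection
```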
Interestingly, Win7 works smoothly even on the lowest pm mode. More optimized drivers or more efficient compositor, maybe?
[*]As mentioned above, VDPAU has a problem in windowed mode. Anyone know what's going on with that?
VDPAU tends to work ok, but it's always had some rough corners. This is one.
I'd actually advise disabling it for windowed applications, provided your CPU is fast enough. It's great for fullscreen apps though.
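With mplayer2 you can make that choice per invocation rather than globally; roughly like this (the file name is just a placeholder):

```
# fullscreen: VDPAU output with hardware decode (the trailing comma lets
# mplayer fall back to software codecs for formats VDPAU can't handle)
mplayer -fs -vo vdpau -vc ffmpeg12vdpau,ffh264vdpau, video.mkv

# windowed: plain Xv output, software decode
mplayer -vo xv video.mkv
```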
[*]Flash says it uses the GPU for decoding, but not for the actual rendering. Is there a way to tell Flash to use Xv/OpenGL/whatever to display the video?
Not anymore, sorry.
Flash has used pure software rendering since 11.2, with no way to enable hardware rendering. Previous versions had a toggleable setting in /etc/adobe/mms.cfg.
(Edit: I've actually removed flash completely on my netbook, as its CPU can't handle software rendering. HTML5 video is significantly faster there!)