NVIDIA Confirms Linux Driver Problems
Last week NVIDIA released the 295.40 Linux driver in order to address a high-risk security vulnerability that could allow attackers to gain access to system memory via the GPU and the unpatched graphics driver. It turns out that the security fix is responsible for the weird issues now being experienced by a number of NVIDIA GeForce Linux users.
Fortunately, it turns out that the problem mainly affects those with pre-G80 graphics hardware. GeForce 6 and GeForce 7 series graphics cards, along with the GeForce 8800GTX and first-generation 8800GTS, are affected when upgrading to this driver. The mainline NVIDIA Linux driver only goes back to the GeForce 6 series, while their legacy drivers haven't yet been updated with this security fix.
"We have been made aware of an interaction problem between the fix contained in the newest release and any card with a chip older than G80, inclusive. This includes the full GeForce 6 and 7 series as well as GeForce 8800GTX and first-gen 8800GTS. We are actively working on resolving this issue and will provide an update as soon as possible. The symptoms can include graphical corruption, performance issues, crashes and temporary hangs. The release should be perfectly safe to use with more recent cards than that," writes an NVIDIA Linux engineer in this forum thread.
Separately, if you're noticing weird clock frequencies reported by the NVIDIA Linux driver on GeForce 600 "Kepler" series hardware, you're not alone. In my testing last week of the NVIDIA GeForce GTX 680 with their binary Linux driver, I noticed that PowerMizer and the "GPU3DClockFreqs" attribute of their NV-CONTROL extension were reporting the GK104 GPU as operating at 705MHz for its core. The GTX 680 should be topping out at 1006MHz.
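For those wanting to see what their own card reports, the attribute mentioned above can be queried from a terminal with nvidia-settings. A minimal sketch is below; it assumes the usual "core,memory" pair that nvidia-settings prints in terse mode, and the actual query line is commented out (with a sample value substituted) so the parsing can be seen without NVIDIA hardware present. The 3004MHz memory figure is purely a hypothetical sample, not a value from this article.

```shell
# Query the 3D clocks the driver reports for the first GPU.
# Requires the NVIDIA binary driver and a running X server, so the
# live query is commented out here and replaced with sample data.
# reported="$(nvidia-settings -t -q '[gpu:0]/GPU3DClockFreqs')"
reported="705,3004"   # sample: the under-reported 705MHz core clock; memory value is hypothetical

core_mhz="${reported%%,*}"   # text before the comma: core clock in MHz
mem_mhz="${reported##*,}"    # text after the comma: memory clock in MHz
echo "core=${core_mhz}MHz mem=${mem_mhz}MHz"
```

On an affected Kepler card this would print the minimum of the clock range (705MHz here) rather than the top boost clock, which is the reporting deficiency described below.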
At first it looked like it might have been a video BIOS issue with the MSI GeForce GTX 680, but it was a retail card, and yesterday I received a message from NVIDIA's Andy Ritger, who got to the bottom of the situation. Andy's message is below.
GPU clock management changed significantly with Kepler, and not all of that is correctly reflected in nvidia-settings, yet. At this point, I believe what you are seeing is strictly due to nvidia-settings reporting deficiencies, rather than the driver not taking proper advantage of the GPU clocks.

For a little more detail:

* On Kepler, select clock domains, such as what nvidia-settings calls "graphics", have a range of possible clock values per Performance Level, and the driver dynamically adjusts within the range of possible values per clock per Performance Level.

* I think what is getting reported in nvidia-settings (both in the PowerMizer page and GPU3DClockFreqs) is the minimum value of each range.

You can be confident that the clock is certainly not running below what is reported in nvidia-settings, and it is likely running above that.

We'll work to get nvidia-settings updated to correctly report clock information on Kepler.

So for any Kepler owners running the NVIDIA Linux driver, the graphics card should be running at the correct frequencies; it just might not be reported that way for now. Coming up tomorrow will be the long-awaited NVIDIA GeForce GTX 680 Linux benchmarks and review.