Two Hacks For The NVIDIA Linux Graphics Driver


  • Two Hacks For The NVIDIA Linux Graphics Driver

    Phoronix: Two Hacks For The NVIDIA Linux Graphics Driver

    A Phoronix reader has shared two NVIDIA binary Linux graphics driver "hacks" he's written for overriding some functionality of the NVIDIA binary blob for GeForce hardware...

  • #2
    What about the problem I've heard about, where NVIDIA cripples its hardware (or is it the driver?) when using doubles instead of floats? Can that only be fixed by moving to AMD?
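    Worth noting: GeForce parts do ship with double-precision throughput capped at a small fraction of single-precision rate, and whether that cap lives in silicon or is enforced by driver/firmware has varied by generation, so it isn't clear a blob hack could lift it. A minimal micro-benchmark sketch like the following (all names illustrative, not from the article or the driver) will show the gap on any CUDA-capable card:

    // Hypothetical micro-benchmark: compare FP32 vs FP64 FMA timing.
    // Build with: nvcc -O2 fp64_ratio.cu
    #include <cstdio>
    #include <cuda_runtime.h>

    template <typename T>
    __global__ void fma_chain(T *out, int iters) {
        // Dependent FMA chain so the compiler cannot discard the work;
        // the timing then reflects arithmetic rate, not memory bandwidth.
        T a = (T)threadIdx.x, b = (T)1.000001, c = (T)0.5;
        for (int i = 0; i < iters; ++i)
            a = a * b + c;
        out[blockIdx.x * blockDim.x + threadIdx.x] = a;
    }

    template <typename T>
    static float time_kernel(T *buf, int blocks, int threads, int iters) {
        cudaEvent_t t0, t1;
        cudaEventCreate(&t0);
        cudaEventCreate(&t1);
        cudaEventRecord(t0);
        fma_chain<T><<<blocks, threads>>>(buf, iters);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        cudaEventDestroy(t0);
        cudaEventDestroy(t1);
        return ms;
    }

    int main() {
        const int blocks = 256, threads = 256, iters = 1 << 16;
        float *buf32;
        double *buf64;
        cudaMalloc(&buf32, blocks * threads * sizeof(float));
        cudaMalloc(&buf64, blocks * threads * sizeof(double));

        time_kernel(buf32, blocks, threads, iters);   // warm-up launch
        float ms32 = time_kernel(buf32, blocks, threads, iters);
        float ms64 = time_kernel(buf64, blocks, threads, iters);

        printf("FP32 %.2f ms, FP64 %.2f ms, FP64 is %.1fx slower\n",
               ms32, ms64, ms64 / ms32);
        cudaFree(buf32);
        cudaFree(buf64);
        return 0;
    }

    On a consumer GeForce, the printed slowdown will typically be well beyond the roughly 2x that a full-rate FP64 part would show.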

    • #3
      an artificial limit on the pixel clock
      I doubt it's an artificial limit. More likely it was the highest value that was validated for the hardware. Going beyond that limit may produce undefined results or non-compliant behavior.

      • #4
        Originally posted by agd5f View Post
        I doubt it's an artificial limit. More likely it was the highest value that was validated for the hardware. Going beyond that limit may produce undefined results or non-compliant behavior.
        Overclocking can cause abnormal behavior as well, but these hacks let gamers and enthusiasts push their hardware further at their own risk.

        • #5
          Originally posted by DeepDayze View Post
          Overclocking can cause abnormal behavior as well, but these hacks let gamers and enthusiasts push their hardware further at their own risk.
          Right, but in both cases, the limits are not artificial. That is the distinction I'm trying to make. Artificial implies that the limit is arbitrary and less than some "real" limit. By overclocking you are going beyond the validated limits of the hardware into undefined territory.

          • #6
            Originally posted by agd5f View Post
            Right, but in both cases, the limits are not artificial. That is the distinction I'm trying to make. Artificial implies that the limit is arbitrary and less than some "real" limit. By overclocking you are going beyond the validated limits of the hardware into undefined territory.
            I think the limit used to be real... When I tried this on my old 9800 GTX+ rig, dmesg spewed out some sort of hardware error if I tried to request a clock even 1 kHz above the 400 MHz limit.

            Now on Kepler (and maybe Fermi as well?) the hardware allows setting pixel clocks greater than 400 MHz - I have my monitors at 462.84 MHz without a single problem - but the kernel driver will refuse to generate timings if you ask for a clock greater than that... probably a relic from the Tesla days.

            What's silly is that they even bothered to put a pixel clock limit in the kernel driver. The X driver already checks that your pixel clock doesn't exceed 330 MHz, but lets you disable that check with the "NoMaxPClkCheck" ModeValidation option. Anyone trying to set more than 400 MHz has to include that option in their xorg.conf anyway, so they're obviously aware they're going past the DVI spec already.
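
            For anyone curious, the X-side override looks roughly like this in xorg.conf ("NoMaxPClkCheck" is a documented ModeValidation token; the identifier is just an arbitrary example):

            Section "Device"
                Identifier "nvidia-card"   # arbitrary name for this example
                Driver     "nvidia"
                # Skip the driver's maximum pixel clock check so that modes
                # above the 330 MHz limit survive mode validation.
                # Out-of-spec timings are entirely at your own risk.
                Option     "ModeValidation" "NoMaxPClkCheck"
            EndSection

            A custom mode can then be generated with a tool such as cvt or gtf and added to the matching Monitor section; the separate 400 MHz cap in the kernel module is the part the pixel clock hack from the article overrides.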

            • #7
              Originally posted by CFSworks View Post
              Now on Kepler (and maybe Fermi as well?) the hardware allows setting pixel clocks greater than 400 MHz - I have my monitors at 462.84 without a single problem - but the kernel driver will refuse to generate timings if you ask for a clock greater than that... probably a relic from the Tesla days.
              What monitors are you using? Is there a list of monitors that will accept high-rate signals at high resolution? Ever since I switched from CRTs to LCDs, I've longed for refresh rates greater than 60Hz at maximum resolution. I really miss running at 120-150Hz on a CRT.

              • #8
                Refresh rates on LCDs work a bit differently; the duty cycle is very different. If you're familiar with square waves, you'll know what I'm talking about: each pixel stays "lit" for a much longer portion of the cycle than it would on an old CRT. As such, the refresh rate on an LCD matters less, or rather, it's roughly equivalent to a higher rate on a CRT.

                • #9
                  Originally posted by duby229 View Post
                  Refresh rates on LCDs work a bit differently; the duty cycle is very different. If you're familiar with square waves, you'll know what I'm talking about: each pixel stays "lit" for a much longer portion of the cycle than it would on an old CRT. As such, the refresh rate on an LCD matters less, or rather, it's roughly equivalent to a higher rate on a CRT.
                  I do understand the differences between LCDs and CRTs. The scan rate from a video card is still called "refresh" rate, though, and I want moar framez.

                  • #10
                    Originally posted by unix_epoch View Post
                    What monitors are you using? Is there a list of monitors that will accept high-rate signals at high resolution? Ever since I switched from CRTs to LCDs, I've longed for refresh rates greater than 60Hz at maximum resolution. I really miss running at 120-150Hz on a CRT.
