Two Hacks For The NVIDIA Linux Graphics Driver


  • Two Hacks For The NVIDIA Linux Graphics Driver

    Phoronix: Two Hacks For The NVIDIA Linux Graphics Driver

    A Phoronix reader has shared two NVIDIA binary Linux graphics driver "hacks" he's written for overriding some functionality of the NVIDIA binary blob for GeForce hardware...

    http://www.phoronix.com/vr.php?view=MTQxNjM

  • #2
    What about the issue I've heard about, where NVIDIA cripples double-precision (FP64) performance on their hardware (or is it done in software?) compared to single-precision floats? Can that only be fixed by moving to AMD?

    Comment


    • #3
      "an artificial limit on the pixel clock"
      I doubt it's an artificial limit. More likely it was the highest value that was validated for the hardware. Going beyond that limit may produce undefined results or non-compliant behavior.

      Comment


      • #4
        Originally posted by agd5f View Post
        I doubt it's an artificial limit. More likely it was the highest value that was validated for the hardware. Going beyond that limit may produce undefined results or non-compliant behavior.
        Overclocking can cause abnormal behavior as well, but these hacks let gamers and enthusiasts push their hardware further at their own risk.

        Comment


        • #5
          Originally posted by DeepDayze View Post
          Overclocking can cause abnormal behavior as well, but these hacks let gamers and enthusiasts push their hardware further at their own risk.
          Right, but in both cases, the limits are not artificial. That is the distinction I'm trying to make. Artificial implies that the limit is arbitrary and less than some "real" limit. By overclocking you are going beyond the validated limits of the hardware into undefined territory.

          Comment


          • #6
            Originally posted by agd5f View Post
            Right, but in both cases, the limits are not artificial. That is the distinction I'm trying to make. Artificial implies that the limit is arbitrary and less than some "real" limit. By overclocking you are going beyond the validated limits of the hardware into undefined territory.
            I think the limit used to be real... When I tried this on my old 9800 GTX+ rig, dmesg spewed out some sort of hardware error if I tried to request a clock even 1 kHz above the 400 MHz limit.

            Now on Kepler (and maybe Fermi as well?) the hardware allows setting pixel clocks greater than 400 MHz - I have my monitors at 462.84 MHz without a single problem - but the kernel driver will refuse to generate timings if you ask for a clock greater than that... probably a relic from the Tesla days.

            What's silly is the fact that they even bothered to put a pixel clock limit in the kernel driver. The X driver already checks to make sure your pixel clock doesn't exceed 330 MHz, but allows you to disable it with the "NoMaxPClkCheck" ModeValidation option. Anyone trying to set more than 400 MHz will have to include that option in their xorg.conf anyway, so they'll already be aware that they're going past the DVI spec.
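            For reference, this is roughly what that looks like in xorg.conf. It's just a minimal sketch; "DFP-0" is an example connector name, so substitute whatever name the driver reports for your display:

                Section "Screen"
                    Identifier "Screen0"
                    Device     "Device0"
                    # Skip the X driver's own pixel clock validation for this display.
                    # The kernel module's 400 MHz cap still applies unless it's patched
                    # out as described in the article.
                    Option     "ModeValidation" "DFP-0: NoMaxPClkCheck"
                EndSection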

            Comment


            • #7
              Originally posted by CFSworks View Post
              Now on Kepler (and maybe Fermi as well?) the hardware allows setting pixel clocks greater than 400 MHz - I have my monitors at 462.84 MHz without a single problem - but the kernel driver will refuse to generate timings if you ask for a clock greater than that... probably a relic from the Tesla days.
              What monitors are you using? Is there a list of monitors that will accept high-rate signals at high resolution? Ever since I switched from CRTs to LCDs, I've longed for refresh rates greater than 60Hz at maximum resolution. I really miss running at 120-150Hz on a CRT.

              Comment


              • #8
                Refresh rates on LCDs are a bit different: the duty cycle is a lot different. If you are familiar with square waves you'll know what I'm talking about. Each pixel is "lit" for a longer portion of the cycle than on an old CRT, so the refresh rate on LCDs is less important, or rather it is equivalent to a higher rate.

                Comment


                • #9
                  Originally posted by duby229 View Post
                  Refresh rates on LCDs are a bit different: the duty cycle is a lot different. If you are familiar with square waves you'll know what I'm talking about. Each pixel is "lit" for a longer portion of the cycle than on an old CRT, so the refresh rate on LCDs is less important, or rather it is equivalent to a higher rate.
                  I do understand the differences between LCDs and CRTs. The scan rate from a video card is still called "refresh" rate, though, and I want moar framez.

                  Comment


                    • #11
                      Originally posted by unix_epoch View Post
                      What monitors are you using? Is there a list of monitors that will accept high-rate signals at high resolution? Ever since I switched from CRTs to LCDs, I've longed for refresh rates greater than 60Hz at maximum resolution. I really miss running at 120-150Hz on a CRT.
                      While there's no 1440p monitor that will advertise support for 120Hz, since it's way outside of the DVI spec, there are a few monitors that have been found capable of overclocking that high. I'd suggest looking around on the 120hz.net forums for such a list. Keep in mind, however, that like all overclocking, it's a matter of luck: If you buy such a monitor, there's no guarantee that it'll handle 120Hz with no problems.

                      Personally, I use a pair of QNIX QX2710s. They're nice, fairly cheap monitors and I'm able to do 120Hz on them pretty well, with only two drawbacks: one of them tends to buzz when displaying certain images at 120Hz (it doesn't happen at 60Hz), which is fixable, but I'll have to disassemble the monitor to fix it. The other is that they have a slight image persistence problem at 120Hz.

                      Also, I have a friend with a Yamakasi Catleap Q270 2B (the 2C will not handle 120Hz) and he says it works pretty great.

                      We both bought ours through eBay user green-sum. He's friendly and ships fast.
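                      For a sense of the numbers: the pixel clock is just htotal x vtotal x refresh rate, so 2560x1440 at 120Hz works out to roughly 500 MHz with standard reduced blanking, and still well over 400 MHz even with tighter hand-tuned timings (like the 462.84 MHz mentioned earlier). A quick way to see what a given mode requires (the comments describe typical output, nothing to copy verbatim):

                          $ cvt -r 2560 1440 120
                          # Prints a CVT reduced-blanking modeline; the first number after
                          # the mode name is the pixel clock in MHz, and for this mode it
                          # lands near 500 - far beyond both the X driver's 330 MHz check
                          # and the kernel driver's 400 MHz cap discussed above.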

                      Comment


                      • #12
                        Originally posted by unix_epoch View Post
                        I want moar framez.
                        Why? If you have a 60Hz LCD it can only show 60fps (it skips anything above that anyway), so your only "benefit" is more work for the GPU -> more power consumption and a shorter lifetime.

                        Comment


                        • #13
                          As an FYI, both monitors I mentioned above have incorrect EDID checksums for some reason - you'll have to configure your display server to work around that or the monitor will not be detected correctly.
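                          One way to handle that with the NVIDIA driver is to skip the monitor's broken EDID entirely and hand the driver a corrected dump instead. Just a sketch, and the connector name and file path here are examples, not anything specific to these monitors:

                              Section "Device"
                                  Identifier "Device0"
                                  Driver     "nvidia"
                                  # Point the driver at a repaired EDID blob (grab the original
                                  # with nvidia-settings or read-edid, then fix the checksum)
                                  # instead of the bad one the monitor reports.
                                  Option     "CustomEDID" "DFP-0:/etc/X11/edid-qx2710.bin"
                              EndSection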

                          Comment


                          • #14
                            Finally! I was so jealous that Windows users were getting 120Hz at 1440p while I was stuck at 96Hz due to this limit.

                            Comment


                            • #15
                              Originally posted by duby229 View Post
                              Refresh rates on LCDs are a bit different: the duty cycle is a lot different. If you are familiar with square waves you'll know what I'm talking about. Each pixel is "lit" for a longer portion of the cycle than on an old CRT, so the refresh rate on LCDs is less important, or rather it is equivalent to a higher rate.
                              I am pretty sure that the pixels of LCD monitors remain "lit" continuously without regard to the refresh rate of the signal. This is, in my opinion, what makes them so much better than CRTs and plasmas because they never flicker, even at "low" refresh rates like 60Hz.

                              Comment
