
How do I lock performance mode to minimal on nvidia blob?


  • #11
    Originally posted by not.sure
    Voltages, yeah, it probably doesn't change those, but I'm not sure that's dangerous. Hard to imagine, but hey, we've seen pigs fly.

    You should be able to make it 'one-button' by writing a script and then executing it in a startup script or binding it to some desktop button. Most if not all nvidia-settings options can be set via the command line, check out nvidia-settings -q all.
    Way back when, I also used nvclock to mess with frequencies, which was even easier. But nvclock probably doesn't support Fermi chips yet.
    All this worked great over the years for my FX570M (=GF8600M).
    Seems like my only option left...
    I have a GTX 260 (as mentioned in the 1st post), but I don't want to burn it (yet)...
    Thanks for the suggestions.
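
    For reference, a minimal sketch of the kind of 'one-button' script suggested above. It assumes the driver exposes GPUPowerMizerMode as a writable attribute; on many cards and driver versions of this era it is read-only or missing entirely, which is exactly the problem in this thread, so check the -q all output first:

        #!/bin/sh
        # Sketch only: the attribute name and values below come from nvidia-settings
        # documentation; on many cards/driver versions GPUPowerMizerMode is
        # read-only or absent, in which case this script cannot help.

        # See everything the driver exposes and what is actually writable:
        nvidia-settings -q all

        # Current PowerMizer mode on the first GPU:
        nvidia-settings -q '[gpu:0]/GPUPowerMizerMode'

        # Try to switch back to adaptive clocking (0 = adaptive, 1 = prefer max performance):
        nvidia-settings -a '[gpu:0]/GPUPowerMizerMode=0'

    A script like this could then be bound to a desktop shortcut or run from a session startup script, as not.sure describes.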



    • #12
      Originally posted by deanjo
      And you can buy a GTX-580 for $45? And BTW, it is not a bug if it works as it was designed to do.

      (PS, power here is $0.08/kWh)
      So you're offering to pay my electricity bills? Heh, pretty nice of you.



      • #13
        One thing I used to do when I was a Nvidia user was to remove the PCI-E power plug from my gfx card whenever I wasn't planning on doing anything intensive. The card would stay in low-power mode (<75W), and whenever I planned on using something that required its full capabilities I would shut down, plug the cable in, and restart. I had to add "--no-power-connector-check" to xorg.conf, and that worked. Sadly, ATI does not allow their cards to function without that cable, despite their decreased power usage compared to Nvidia cards.
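
        For reference, the connector check is normally disabled through an X config option rather than a literal command-line flag; a minimal sketch of the Device section, using the NoPowerConnectorCheck option name from the nvidia driver README (the exact form Max Spain used may differ by driver version):

            # /etc/X11/xorg.conf -- Device section, sketch only
            Section "Device"
                Identifier "nvidia-card"
                Driver     "nvidia"
                Option     "NoPowerConnectorCheck" "true"
            EndSection

        Running a card this way keeps it limited to the roughly 75W the PCI-E slot itself can deliver, which is the low-power behaviour described above.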



        • #14
          Originally posted by Max Spain
          One thing I used to do when I was a Nvidia user was to remove the PCI-E power plug from my gfx card whenever I wasn't planning on doing anything intensive. The card would stay in low-power mode (<75W), and whenever I planned on using something that required its full capabilities I would shut down, plug the cable in, and restart. I had to add "--no-power-connector-check" to xorg.conf, and that worked. Sadly, ATI does not allow their cards to function without that cable, despite their decreased power usage compared to Nvidia cards.
          I have two of them.

          So far, I have tested the system with a power-draw measurement device and the results turned out better than expected. The power consumption doesn't depend much on the driver's performance state, but rather on the actual load. The whole system (80 Plus Bronze PSU) draws 113 watts in idle mode and 130 watts when performance is forced.

          OK, but when I load native Quake 1 with the newest shaders (darkplaces), the consumption skyrockets to a 185 watt peak.

          I guess it's not a big deal playing at 120 or 130 watts; it just doesn't detect and scale the modes very efficiently. A manual switch would be much appreciated.
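
          For what it's worth, those numbers also put deanjo's $0.08/kWh remark in perspective. A rough back-of-the-envelope calculation, assuming the box sits idle in the forced-performance state around the clock (so a constant 130 W - 113 W = 17 W gap, which is an assumption about usage, not a measurement):

              # extra cost per year of a constant 17 W draw, priced at $0.08/kWh
              echo "0.017 * 24 * 365 * 0.08" | bc -l   # ~= 11.9 dollars per year

          At that rate, the forced mode costs on the order of twelve dollars a year on top of the noise and heat.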



          • #15
            Well, well, well, look at that, one of the first Google results for my query got me here. Obviously Phoronix is doing pretty well in the marketing department, which I believe--and hope--will translate into a generous revenue stream from all those totally unobtrusive ads I very rarely notice, for I block the hell out of them. No less obviously, I instantly knew I'd get nothing solved, but at least we can do as usual and rant some.

            Right, so I inherited a not-so-old laptop and decided to put on it this Linux thing you people keep having arguments about. Looking good, the installation wasn't as hard as I expected from the great corpus of uninformed blogs, articles and news stories I have consumed over the last years. The wireless thing, whose purpose I am told is to gang up with a wireless box my phone company sold me, didn't want to work at first, but there's hardly anything a determined internet user can't figure out. And look--I thought--since this machine has a Nvidia card, I'm going to be cool and install that award-winning green driver; I too want to experience the power of the binary-only side.

            Which only took me a couple of tries (it would probably have been even simpler had I opted for a so-called beginner-friendly distribution, but I managed just fine). So I launch the window with all the bits and bobs you can fiddle with, sweet... But hey, w-what's that? The card's locked into its maximum performance settings and the fan's deafening me; Jesus, I'm only trolling--and quite mildly so--I don't want to imagine what it does when I actually try to do something on this machine. And look, you can't change it, it's all grayed out. I don't get it, how again is this any better than, say, the radeon open source driver in terms of power management? I know that one: it isn't, because I CAN change the power state on my other laptop and not be annoyed by this fucking noise.

            But it's OK. Because only over here would I learn that "it's not a bug if it works as it was designed to".
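
            For comparison, this is roughly what "changing the power state" looks like on the radeon KMS driver of that era; a sketch only, run as root, and the exact paths and accepted values depend on the kernel version and card:

                # select profile-based power management and drop to the low profile
                echo profile > /sys/class/drm/card0/device/power_method
                echo low > /sys/class/drm/card0/device/power_profile

                # check the resulting engine/memory clocks (requires debugfs mounted)
                cat /sys/kernel/debug/dri/0/radeon_pm_info

            The nvidia blob offers no equivalent knob when the PowerMizer controls are grayed out, which is the complaint here.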



            • #16
              What's the card? Fermi?
              Basically all gfx cards suck more or less on Linux nowadays when it comes to rough edges like this. And the worst part is, it's not due to Linux.
