  • How do I lock performance mode to minimal on nvidia blob?

    So, I have a GTX 260 (216 SP) card now, and I'm trying to play good old Fallout 2.

    But every time I start it, the card goes wild and jumps to performance level 2. That means that instead of drawing 40 watts from the outlet, it starts eating 150 watts.

    Yet when I use KDE4's KWin compositing, it always stays at level 0.

    I tried turning compositing off - no change. Every time a 2D DirectDraw game launches in Wine, the card is propelled to max performance.

    Is there a way to lock the card temporarily to level 0? Preferably without rebooting and without messing directly with the frequencies.

    How does the driver determine that it needs full power? Why does it stay at 0 with compositing, yet go straight to 2 (max performance) when any DirectDraw Wine app is launched? If I install Firefox 3.5 in Wine, the performance level stays at 0, even if I select OpenGL as the renderer instead of GDI...

  • #2
    It might be possible with static PowerMizer modes and nvidia-settings. Read this:
    http://tutanhamon.com.ua/technovodst...A-UNIX-driver/
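
    Roughly, that guide boils down to an xorg.conf snippet along these lines (an untested sketch for the 270-era blob - these registry keys are undocumented, so verify them against the guide itself):

      Section "Device"
          Identifier "nvidia"
          Driver     "nvidia"
          # PerfLevelSrc=0x2222 pins the level on both AC and battery;
          # PowerMizerDefault/PowerMizerDefaultAC=0x3 selects the lowest level.
          Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefault=0x3; PowerMizerDefaultAC=0x3"
      EndSection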

    • #3
      Messing with the frequencies in the 'overclock' tab, and effectively using those to underclock, used to work though.

      • #4
        Thanks for the replies so far...

        Originally posted by tuke81 View Post
        It might be possible with static PowerMizer modes and nvidia-settings. Read this:
        http://tutanhamon.com.ua/technovodst...A-UNIX-driver/
        Thanks. The problem is, it's the first article that comes up for "nvidia linux force performance level", and essentially the first thing I read...

        While you MAY set registry words in xorg.conf, the main driver developer, Aaron Plattner (AaronP), does not recommend it (first 4 posts: http://www.nvnews.net/vbulletin/showthread.php?t=149727), because it interferes with VDPAU, providing insufficient clocks should one force the GPU to level 0 and then try to watch video. I think it goes somewhat deeper - maybe HDCP requires a certain performance level, and hence VDPAU uses the HDCP-protected path... Maybe I'm wrong.

        I would use this approach, but aside from the warning from the main NVIDIA developer, there is the inconvenience of having to restart Xorg every time I want to pin the card to a certain performance profile.

        Originally posted by not.sure View Post
        Messing with the frequencies in the 'overclock' tab, and effectively using those to underclock, used to work though.
        Best idea so far, thanks, but...
        - it does not affect the GPU voltages (I'm pretty sure the chip and memory do get reduced voltages in the idle state).
        -- Also, I've heard there is a certain minimum voltage for a given clock, below which the chip may be damaged (here it would be running at low Hz and high voltage). Just a thought, though...
        - there's no way to do it with one button. I would have to click to enable overclocking, scroll down and accept the license, then set both GPU and memory clocks... every time I restart the machine...

        Another funny thing, check this:
        1) This is an NVIDIA 6100M, a very, very weak mobile GPU chip:
        [screenshot]

        2) And this is an NVIDIA GTX 580. A 200-watt monstrosity.
        [screenshot]

        Now imagine both owners decide to play 2D Fallout 2... or, say, Battle for Wesnoth.
        Both cards will skyrocket to performance level 2 (notice there are ONLY 3 performance levels, even for the fastest top card!). But the GTX does not remotely need that performance level! The difference is obvious - the guy with the GTX 580 will end up paying for another GTX 580, except the money goes to the electric company in the form of a bill.....

        This surely needs a great deal of attention. Not only is there no way to lock in the desired performance level (via the nvidia-settings CLI you can set the mode to adaptive or maximum performance, yet not to "minimal"), the performance level steps are extremely unbalanced - seriously, why not give the GTX 580 SIX performance levels and let it scale along them... :I'm confused:
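
        For reference, the mode switching I mean looks something like the line below. (On newer blobs the attribute is called GPUPowerMizerMode; whether the current driver already exposes it is an assumption on my part - check nvidia-settings -q all.)

          # 0 = Adaptive, 1 = Prefer Maximum Performance; tellingly, there is
          # no value meaning "prefer minimum" - which is exactly my complaint.
          nvidia-settings -a [gpu:0]/GPUPowerMizerMode=0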
        Last edited by crazycheese; 22 June 2011, 10:15 PM.

        • #5
          Originally posted by crazycheese View Post
          But the GTX does not remotely need that performance level! The difference is obvious - the guy with the GTX 580 will end up paying for another GTX 580, except the money goes to the electric company in the form of a bill.....
          Lol, HUGE exaggeration there, unless you are paying well over $1 per kilowatt-hour and running at max performance levels 24/7. When it comes to graphics, the last thing you want is the system jumping between power levels, because the demand on graphics capability can vary greatly, even within the same application.

          • #6
            Originally posted by deanjo View Post
            Lol, HUGE exaggeration there, unless you are paying well over $1 per kilowatt-hour and running at max performance levels 24/7. When it comes to graphics, the last thing you want is the system jumping between power levels, because the demand on graphics capability can vary greatly, even within the same application.
            1 kWh = €0.20 (~$0.34 - note this is the BEST price on the market; the normal rate (gesetzlicher Tarif, the regulated tariff) is well over $0.50. The tendency is UP due to Fukushima (nuclear plants closing worldwide, switching to green sources).
            Gaming on a GTX 580 for 2 hours every day (a 2D game...):
            0.240 (kW) x 2 (h/day) x 7 (days/week) x 4 (weeks/month) x 12 (months) x 0.34 ($/kWh) = $54.84 for the CARD ALONE !!!!
            (with proper downclocking) 0.040 (kW) x 2 x 7 x 4 x 12 x 0.34 = $9.14
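
            You can double-check that arithmetic in any shell (same assumptions: 2 h/day, 7 days/week, 4 weeks/month, 12 months, $0.34/kWh):

              # yearly cost = power (kW) x hours per year x price ($/kWh)
              echo '0.240 * (2*7*4*12) * 0.34' | bc -l   # -> 54.8352, i.e. ~$54.84
              echo '0.040 * (2*7*4*12) * 0.34' | bc -l   # ->  9.1392, i.e.  ~$9.14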

            Each year this bug costs a European GTX 580 user nearly $45.
            Being a fanboy does not help fix bugs....
            Last edited by crazycheese; 23 June 2011, 05:48 AM.

            • #7
              Originally posted by crazycheese View Post
              1 kWh = €0.20 (~$0.34 - note this is the BEST price on the market; the normal rate (gesetzlicher Tarif, the regulated tariff) is well over $0.50. The tendency is UP due to Fukushima (nuclear plants closing worldwide, switching to green sources).
              Gaming on a GTX 580 for 2 hours every day (a 2D game...):
              0.240 (kW) x 2 (h/day) x 7 (days/week) x 4 (weeks/month) x 12 (months) x 0.34 ($/kWh) = $54.84 for the CARD ALONE !!!!
              (with proper downclocking) 0.040 (kW) x 2 x 7 x 4 x 12 x 0.34 = $9.14

              Each year this bug costs a European GTX 580 user nearly $45.
              Being a fanboy does not help fix bugs....
              And you can buy a GTX 580 for $45? And BTW, it is not a bug if it works the way it was designed to.

              (PS, power here is $0.08/kWh)

              • #8
                The NVIDIA blob is closed source, so you cannot change the hardcoded options. And the Nouveau driver doesn't support the 580 well yet, let alone its performance levels.

                • #9
                  Originally posted by crazycheese View Post
                  - it does not affect the GPU voltages (I'm pretty sure the chip and memory do get reduced voltages in the idle state).
                  -- Also, I've heard there is a certain minimum voltage for a given clock, below which the chip may be damaged (here it would be running at low Hz and high voltage). Just a thought, though...
                  - there's no way to do it with one button. I would have to click to enable overclocking, scroll down and accept the license, then set both GPU and memory clocks... every time I restart the machine...
                  Voltages - yeah, it probably doesn't change those, but I'm not sure that's dangerous. Hard to imagine, but hey, we've seen pigs fly.

                  You should be able to make it 'one-button' by writing a script and then running it from a startup script or binding it to some desktop button. Most if not all nvidia-settings options can be set from the command line; check out nvidia-settings -q all. Something like the sketch below, for example.
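
                  (An untested sketch: the attribute names are what a 270-era blob reports in nvidia-settings -q all, it assumes overclocking is unlocked with Option "Coolbits" "1" in xorg.conf, and the clock values are placeholders - substitute your card's level-0 clocks.)

                    #!/bin/sh
                    # One-button underclock: enable manual clocking, then force
                    # low 3D clocks so DirectDraw games can't spin the card up.
                    nvidia-settings -a GPUOverclockingState=1 \
                                    -a GPU3DClockFreqs=300,600   # GPU MHz, memory MHz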
                  Way back when, I also used nvclock to mess with the frequencies; that was even easier. But nvclock probably doesn't support Fermi chips yet.
                  All this worked great for years on my FX570M (= GF8600M).

                  • #10
                    There's indeed no Fermi support. And it's possible that you can't run nvclock alongside the proprietary driver at all - or the NVIDIA blob might just put the clocks back to the 3rd level automatically.
