
The Power Consumption & Efficiency Of Open-Source GPU Drivers

  • #11
    Limiting to roughly your refresh rate (maybe 61 FPS with a 60 Hz screen to reduce tearing) is the best idea: it's just as smooth and saves a huge amount of power (which in turn means less noise, heat, hardware wear…). If only game developers set that as the default…

    Most “progamers” will say it sucks, but at least your graphics card doesn't burn for nothing.
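
    A minimal sketch of what such a cap looks like, assuming a plain C loop with clock_gettime/nanosleep; render_frame() and the 61 FPS figure are only illustrative:

    /* Rough sketch of capping a render loop near the refresh rate so the
     * GPU idles instead of drawing frames the display can never show.
     * render_frame() is a hypothetical stand-in for the game's draw code. */
    #define _POSIX_C_SOURCE 199309L
    #include <stdint.h>
    #include <time.h>

    #define TARGET_FPS   61
    #define NSEC_PER_SEC 1000000000LL

    static int64_t now_ns(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (int64_t)ts.tv_sec * NSEC_PER_SEC + ts.tv_nsec;
    }

    void render_frame(void); /* hypothetical: provided by the game */

    void run_capped_loop(void)
    {
        const int64_t frame_ns = NSEC_PER_SEC / TARGET_FPS;
        for (;;) {
            int64_t start = now_ns();
            render_frame();
            int64_t left = frame_ns - (now_ns() - start);
            if (left > 0) {
                /* CPU and GPU sit idle here instead of burning power. */
                struct timespec ts = { .tv_sec = 0, .tv_nsec = (long)left };
                nanosleep(&ts, NULL);
            }
        }
    }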



    • #12
      Originally posted by Calinou View Post
      Limiting to roughly your refresh rate (maybe 61 FPS with a 60 Hz screen to reduce tearing) is the best idea: it's just as smooth and saves a huge amount of power (which in turn means less noise, heat, hardware wear…). If only game developers set that as the default…

      Most “progamers” will say it sucks, but at least your graphics card doesn't burn for nothing.
      As others have pointed out, vsync can have detrimental effects on input latency. Also, there was a time when it actually had a serious negative impact on performance, though I haven't seen a game where that happens in a very long time, and even then it was only noticeable on low-end GPUs.

      As for GPUs burning up, that really only applies to a small handful of models that were poorly designed, such as the GTX 480. I'm not saying that GPU was crap, but it was a relatively lazy design and its TDP shows it. I don't feel so great about the R9 290X either, but at least that GPU won't kill itself.
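
      To put rough numbers on the latency point (an illustration, not a measurement): with vsync on, a finished frame can wait up to one full refresh interval before it is shown, on top of render time and any extra buffering.

      /* Worst-case extra wait added by syncing to the refresh rate. */
      #include <stdio.h>

      int main(void)
      {
          const double refresh_hz[] = { 60.0, 75.0, 144.0 };
          for (int i = 0; i < 3; i++)
              printf("%5.0f Hz: up to %4.1f ms of extra wait per frame\n",
                     refresh_hz[i], 1000.0 / refresh_hz[i]);
          return 0;
      }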



      • #13
        Originally posted by schmidtbag View Post
        As others have pointed out, vsync can have detrimental effects on input latency. Also, there was a time when it actually had a serious negative impact on performance, though I haven't seen a game where that happens in a very long time, and even then it was only noticeable on low-end GPUs.
        As others have pointed out something they read on teh internets...

        It depends on the engine, on the main loop to be more precise.

        Drawing takes a finite amount of time, time that can be "predicted" (more or less),
        so if the engine adapts its timings the latency can get pretty low, even while leaving some headroom
        (the problem of course being that if a deadline is missed, a frame gets dropped).
        More on loops: http://entropyinteractive.com/2011/0...the-game-loop/
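
        Roughly what "adapting its timings" tends to look like, as a sketch only; next_vblank_ns() and render_frame() are made-up stand-ins, and the 2 ms headroom is an arbitrary choice:

        /* Predictive frame pacing: measure how long frames take, then start
         * drawing as late as that prediction allows, keeping some headroom.
         * Guess wrong and the deadline is missed, so that frame is dropped. */
        #define _POSIX_C_SOURCE 199309L
        #include <stdint.h>
        #include <time.h>

        #define NSEC_PER_SEC 1000000000LL

        static int64_t now_ns(void)
        {
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            return (int64_t)ts.tv_sec * NSEC_PER_SEC + ts.tv_nsec;
        }

        int64_t next_vblank_ns(void);  /* hypothetical: when the next refresh lands */
        void    render_frame(void);    /* hypothetical: the game's draw call */

        void paced_frame(int64_t *avg_draw_ns)
        {
            const int64_t headroom_ns = 2 * 1000000LL;  /* 2 ms safety margin */
            int64_t start_at = next_vblank_ns() - *avg_draw_ns - headroom_ns;

            int64_t wait = start_at - now_ns();
            if (wait > 0) {
                /* sleeping here also means input is sampled as late as possible */
                struct timespec ts = { .tv_sec = wait / NSEC_PER_SEC,
                                       .tv_nsec = (long)(wait % NSEC_PER_SEC) };
                nanosleep(&ts, NULL);
            }

            int64_t t0 = now_ns();
            render_frame();
            /* exponential moving average of draw time = the "prediction" */
            *avg_draw_ns = (*avg_draw_ns * 7 + (now_ns() - t0)) / 8;
        }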

        There is also latency caused by the Xorg server (and a little from the whole input stack in the kernel),
        though I think that was reduced enough a while ago.

        Then there is what double and triple buffering bring, and so on and so forth.

        Anyway, I play CS 1.6 and Xonotic with vsync.
        It would probably help me to turn it off, the counter-argument being the awful tearing when turning more than 90° in a fraction of a second.
        But hey, at least it's not hellishly hot in my room.

        PS: the Source engine, the Dota 2 version to be precise, drops to half the sync rate if too many frames are dropped.
        This also reduces jitter, and jitter is bad for humans (and may or may not be more apparent without vsync).
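
        A guess at what that fallback amounts to (not Valve's actual code; the window and threshold here are invented): presenting every other vblank gives a steady 30 FPS instead of frames bouncing between 16.7 ms and 33.3 ms, which is where the jitter reduction comes from.

        #include <stdbool.h>

        #define WINDOW     60   /* frames to look back over    */
        #define MISS_LIMIT 10   /* tolerated misses per window */

        /* Call once per presented frame; returns the refresh divisor to use
         * (1 = sync every vblank, 2 = sync every other vblank = half rate). */
        int choose_sync_divisor(bool missed_deadline)
        {
            static int frames, misses, divisor = 1;

            if (missed_deadline)
                misses++;
            if (++frames == WINDOW) {
                divisor = (misses > MISS_LIMIT) ? 2 : 1;  /* re-evaluate per window */
                frames = misses = 0;
            }
            return divisor;
        }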
        Last edited by gens; 27 July 2014, 09:33 AM.



        • #14
          Michael, I beg you, please: next time you test the closed-source drivers, test the very same cards and the very same benchmarks! Please!



          • #15
            Originally posted by brosis View Post
            Michael, I beg you, please: next time you test the closed-source drivers, test the very same cards and the very same benchmarks! Please!
            I always do when explicitly running an open vs. closed comparison in a single article, as is the case next week.
            Michael Larabel
            https://www.michaellarabel.com/

