NVIDIA 387.12 Vulkan vs. OpenGL Performance Across Multiple CPUs


  • #21
bridgman So, CPU utilization alone is not a proper measure of CPU usage? That doesn't make much sense to me.

I respect the fact that you guys know much more than me on this topic, but my point is that "bottleneck" is a conceptual term, not a concrete one like "hertz" or "cycle". There's always a "bottleneck", but logically, as long as the resources are there, it's not a resource bottleneck, and that's my problem with the term. In this concrete example (games) you never have a CPU bottleneck, not with a decent CPU, and all of the tested CPUs are decent.

Now, where the software limitation lies, in the game, in the driver, or in the API itself, that's another question; that's how I see it. I see where gamerk2 is coming from with latency, but logically it's not a CPU bottleneck even if that is the case.
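    (A toy illustration of the utilization point being debated here; the code and numbers are mine, not from the thread: a single-threaded game that saturates one core never shows 100% aggregate CPU utilization on a multi-core machine, so "the CPU never hits 100%" does not rule out a CPU limit on the hot thread.)

    ```cpp
    #include <cstdio>
    #include <thread>

    // Hypothetical sketch: one pegged core on an N-core machine shows up
    // as only 100/N percent in an aggregate CPU utilization readout.
    int main() {
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 1; // the count may be unknown; assume 1
        std::printf("1 saturated core on %u cores -> %.1f%% aggregate CPU\n",
                    cores, 100.0 / cores);
    }
    ```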

    Comment


    • #22
      Originally posted by leipero View Post

I have serious problems with the terminology reviewers use (especially Windows reviewers). CPU bottleneck? OK, does the CPU hit 100% utilization? If so, yes, it is a CPU bottleneck; if not, then it is not a CPU bottleneck.

There are problems in programming (I assume) with pushing FPS to its potential in any game/application. For example, a single-threaded game:
      Max Settings = 60 FPS
      High Settings = 130FPS

The difference? Shadow quality (calculated by the CPU). Now:
      Low Settings = 140 FPS

The difference? Low textures, no AF, no AA, no dynamic lights, etc. Only 10 FPS for all that? Now, another test:
GPU-to-CPU synchronisation from 0 frames to 1 frame:
High Settings = 170 FPS
GPU-to-CPU synchronisation from 0 frames to none:
High Settings = 180 FPS

But there's a side effect of that: you may gain those 40 FPS or even more, but input latency goes up; if you set it to none, it might go up as much as 4 times or more. Sadly, most game developers use those cheap tricks in their games, and that is the main reason you can't get really "direct" games in the last ~10 years or so. So instead of developing a better API (such as Vulkan or something even better), they resort to cheap tricks that ruin gameplay and make games not fun to play. I'm not sure, but I think the last games that used immediate GPU-CPU synchronization in the first-person shooter genre were Source Engine based games; I'm not even sure if CS:GO is still there or has moved to at least 1 frame, I don't know.

Btw, all the numbers I've presented here are "pulled out of my axx"; it's just an example.
I really had the same impression as you, but you have greater knowledge and explained it better. Thanks a lot for your comments!
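      (The synchronisation trade-off described in the quote can be sketched roughly as a frames-in-flight cap in the render loop. This is a minimal sketch under my own naming; Fence, wait_for_gpu, submit_frame, and sample_input are hypothetical stand-ins for a real API such as Vulkan fences, not any particular engine's code.)

      ```cpp
      #include <cstdio>

      // With kMaxFramesInFlight = 0 the CPU waits on the GPU every frame
      // ("immediate" sync: lowest input latency, lowest throughput).
      // Raising it lets the CPU queue frames beyond the one the GPU is
      // working on, which buys FPS but delays the frame in which the
      // sampled input becomes visible.
      constexpr int kMaxFramesInFlight = 1; // 0 = wait for GPU each frame

      struct Fence { bool signaled = true; };

      void wait_for_gpu(Fence& f) { f.signaled = true;  } // block on the GPU
      void submit_frame(Fence& f) { f.signaled = false; } // queue GPU work
      void sample_input()         {}                      // read player input

      int main() {
          Fence fences[kMaxFramesInFlight + 1];
          for (long frame = 0; frame < 10; ++frame) {
              Fence& slot = fences[frame % (kMaxFramesInFlight + 1)];
              // Reusing a busy slot would let the CPU queue deeper than
              // the cap allows; block here until the GPU catches up.
              if (!slot.signaled) wait_for_gpu(slot);
              sample_input(); // input read here shows up ~queue-depth frames later
              submit_frame(slot);
              std::printf("frame %ld submitted\n", frame);
          }
      }
      ```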

      Comment


      • #23
        Originally posted by leipero View Post
bridgman So, CPU utilization alone is not a proper measure of CPU usage? That doesn't make much sense to me.
I'm not saying that... just that CPU utilization alone is not a proper measure of whether or not there is a "CPU bottleneck" - which is a fairly vague term, but since everyone seems to want to talk about it we should have a definition.

        Originally posted by leipero View Post
I respect the fact that you guys know much more than me on this topic, but my point is that "bottleneck" is a conceptual term, not a concrete one like "hertz" or "cycle". There's always a "bottleneck", but logically, as long as the resources are there, it's not a resource bottleneck, and that's my problem with the term. In this concrete example (games) you never have a CPU bottleneck, not with a decent CPU, and all of the tested CPUs are decent.
Call it a driver bottleneck if that makes you feel better, but that term is too vague, so you end up talking about "that portion of the driver bottleneck where CPU speed significantly affects performance", and that in turn gets abbreviated to "CPU bottleneck".

        Originally posted by leipero View Post
Now, where the software limitation lies, in the game, in the driver, or in the API itself, that's another question; that's how I see it. I see where gamerk2 is coming from with latency, but logically it's not a CPU bottleneck even if that is the case.
If the CPU is not infinitely fast you have the potential for a CPU bottleneck; if the GPU is not infinitely fast you have the potential for a GPU bottleneck. In both cases we should really replace "bottleneck" with something like "dominant contributor to performance changes with this specific workload / API / driver / CPU / GPU", but that is too wordy for most people.

        Terms get abbreviated for convenience, but the abbreviated term rarely makes sense in isolation unless you drag the context along with it. "CPU bottleneck" and "GPU bottleneck" are just examples of that.
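        (bridgman's "dominant contributor" framing can be sketched with a toy model; the model and numbers below are my own, not his. In a pipelined renderer, frame time is roughly max(CPU time, GPU time), so the "bottleneck" is simply whichever term dominates for the specific workload/API/driver/CPU/GPU combination.)

        ```cpp
        #include <algorithm>
        #include <cstdio>

        // Toy model: CPU and GPU work overlap, so the slower side sets
        // the pace of each frame.
        double frame_ms(double cpu_ms, double gpu_ms) {
            return std::max(cpu_ms, gpu_ms);
        }

        int main() {
            // Same GPU work on a slow vs. fast CPU: shrinking the CPU term
            // only helps while it dominates - the usual "CPU bottleneck"
            // symptom in benchmarks.
            std::printf("slow CPU: %.1f ms/frame\n", frame_ms(8.0, 5.0)); // CPU-bound
            std::printf("fast CPU: %.1f ms/frame\n", frame_ms(4.0, 5.0)); // GPU-bound
        }
        ```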

        Comment


        • #24
timofonic Oh, trust me, I have no clue; I'm just guessing from experience.

bridgman Honestly, it has no influence on how it would make me feel, but as you said, it's a very vague concept people are throwing around, and there should be a clear definition of it (if it's possible at all to have a clear definition of a simplified concept for a complex topic).

Of course, there will always be some component that is the "bottleneck", either the GPU, the CPU, or software. I happened to hit an actual CPU bottleneck on an Athlon X2 and a Core 2 Duo back in 2011 (or whenever it was) with Need for Speed: The Run on the Windows platform. It just so happens that one of my favorite games would be CPU "bottlenecked" even with an i7 at 6 GHz and a fairly slow GPU (something in the RX 550 class, or even as low as an R5 250), yet reviewers (especially Windows reviewers) tend to abuse that term so much, even though they should know better.

          Comment


          • #25
leipero Maybe the game has some bad code? I'm only guessing, of course.

            Does it work under Wine?

            Comment


            • #26
              Originally posted by timofonic View Post
leipero Maybe the game has some bad code? I'm only guessing, of course.

              Does it work under Wine?
Of course it has bad code (as every game does); that's my whole point. I'm not complaining, just pointing things out regarding terminology. Yes, all the games I play work under Wine, and they work great. I'm not playing that often, but when I do I want it to work well, and it does. If you are interested in testing it, hit me up with a PM to avoid polluting the thread.

              Comment


              • #27
                Originally posted by faldzip View Post
...I don't know why more than 1 CPU is used for that.
I'm pretty sure that if Michael had used only 1 lower-end CPU, the comment section would be filled with people complaining that it's meaningless without a more powerful CPU to compare to, and if he had used only 1 higher-end CPU, the opposite complaint would have appeared. Personally, I would have liked to see an even slower CPU added to the mix at the low end.

                Comment
