It Looks Like Intel Could Begin Pushing Graphics Tech More Seriously


  • Originally posted by Opossum View Post
    (and besides, I noticed at least my Skylake GPU idling at the desktop consumes far, far less than 1 watt of power. Encoding 1080p@60fps video? 3-5 watts. Stress testing with Unigine Heaven? 15 watts. At least, that's what HWiNFO64 reports.)
    Originally posted by starshipeleven View Post
    On a part that most likely has around 15W total TDP (and a cooling system sized accordingly), yeah that matters.
    On my Skylake desktop i7 it ends up being only around 0.07W while driving a triple-head setup in my case, which is less than half a percent of the TDP of a laptop CPU (15W). The laptop CPUs probably use even less.

    Code:
    # powerstat -D -a -g
    
    Summary:
    CPU:   0.82 Watts on average with standard deviation 0.15  
    GPU:   0.07 Watts on average with standard deviation 0.05
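
    For anyone who wants to cross-check those figures without powerstat, here is a rough sketch reading the kernel's RAPL sysfs interface directly (this assumes the intel_rapl powercap driver is loaded; which domain index maps to the iGPU/"uncore" varies per machine, so check the name files first):

    Code:
    # list the RAPL domains present on this machine
    grep . /sys/class/powercap/intel-rapl:0*/name
    # sample the uncore domain (often the one covering the iGPU) over 10 seconds
    e1=$(cat /sys/class/powercap/intel-rapl:0:1/energy_uj)
    sleep 10
    e2=$(cat /sys/class/powercap/intel-rapl:0:1/energy_uj)
    # energy_uj counts microjoules, so delta / seconds / 1e6 = average watts
    echo "scale=3; ($e2 - $e1) / 10 / 1000000" | bc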



    • Originally posted by duby229 View Post
      You do realize that when a bin of chips is paid for, it's long before it ever got made? Right?
      FYI, most B2B contracts aren't usually fully paid in advance.

      There is not a single chip Intel produces that a consumer bought from them. You are not Intel's customer, dumbass!
      Yeah, the CPUs sold separately in boxes with "Intel" all over them are made by some unnamed Chinese OEM, everyone knows that.



      • Originally posted by duby229 View Post
        You talk about magic bullets and then try to blame that on me? Jackass.
        Heh, I'm not the one who thinks mass production can solve all issues. That's what a "magic bullet" is.

        I know for damn sure that Intel's GPUs are -not- adequate right now and they sure as fuck do have the expertise and the ability to make them so. And their market share is so big, it would cost nothing but a few pennies each.
        As already said, you don't know enough about the field.
        This can't happen because:
        -costs and performance will suffer
        -even a very powerful iGPU is going to lose badly vs. a $100 dedicated GPU or its laptop equivalent (nowadays they fit desktop GPUs in laptops too)
        -there is not enough of a market to justify it anyway



        • Originally posted by calc View Post
          On my Skylake desktop i7 it ends up being only around 0.07W while driving a triple-head setup in my case, which is less than half a percent of the TDP of a laptop CPU (15W). The laptop CPUs probably use even less.

          Code:
          # powerstat -D -a -g
          
          Summary:
          CPU: 0.82 Watts on average with standard deviation 0.15
          GPU: 0.07 Watts on average with standard deviation 0.05
          Driving 2D screens isn't exactly an intensive job.
          3D or hardware decoding is.



          • Originally posted by starshipeleven View Post
            Driving 2D screens isn't exactly an intensive job.
            3D or hardware decoding is.
            Though to be fair, you were arguing from the premise that it's doing 2D--and not much of it--but still causing a significant amount of heat:

            Originally posted by starshipeleven View Post
            I bunched together 2 things,
            -Desktop users where the iGPU is in fact shut down but still wasting die space that would have been better used for whatever, even just a bigger L3/4 cache
            -Laptop users where the iGPU has the screens attached so it is always on, and while still wasting die space it also wastes thermal budget on a very thermally-constrained part already.

            On a part that most likely has around 15W total TDP (and a cooling system sized accordingly), yeah that matters.


            Secondly, interpret this however you want:
            My setup is a 1440p60 monitor hooked up to Nvidia's GTX 960 and a 1080p60 monitor hooked up to Intel's HD 530. The testing platform is Windows 10 (can't do this test in Linux). I can force which graphics card Firefox uses for its hardware acceleration by temporarily setting the respectively-attached monitor as the OS's main display, then launching Firefox (or any GPU-using thing like games).

            I am playing the same 1080p60 YouTube video in all the decoding entries.
            "decode+1440p" means I am decoding the video and sending the frames to the 1440p monitor, which is attached to the Nvidia card; similarly, "decode+1080p" means decoding the video and sending the frames to the 1080p monitor, which is attached to the Intel one.
            Numbers are eyeballed averages over a few seconds.

            decode with HD 530:
            idle: 1.2 W
            decode+1440p: 2.8 W
            decode+1080p: 2.1 W
            render Nvidia frames: 1.6 W

            decode with GTX 960:
            idle: 12 W
            decode+1440p: 30-32 W
            decode+1080p: 29-30 W
            render Intel frames: 14-16 W or 28-29 W (suspect this is due to frequency+voltage scaling)
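
            (The power sampling itself isn't the Windows-only part; only the trick for forcing which card Firefox uses is. A rough sketch of how one could pull comparable per-card numbers on Linux, assuming the proprietary Nvidia driver for nvidia-smi and the same RAPL-based tooling used earlier in the thread for the iGPU:)

            Code:
            # discrete card: the proprietary driver reports board power draw, sampled once per second
            nvidia-smi --query-gpu=power.draw --format=csv,noheader -l 1
            # iGPU: same RAPL-based readout as earlier in the thread, run while the video plays
            powerstat -D -a -g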



            • Originally posted by caligula View Post

              The Steam recommendations on Linux seem bogus. For instance they claim the games require more memory than on Windows. I wonder how that works. My Windows setup uses 1 GB when fully started, my Linux game machine 270 MB. That's 800 MB of headroom. The game might recommend 2 GB on Windows and 3-4 on Linux. Sooo... game uses 1 GB on Windows, 2.7 to 3.7 gigs on Linux. Yea rite.
              My comment was about Intel's drivers in general, without taking any particular OS into consideration. Your point is still valid though.



              • Originally posted by Opossum View Post
                Though to be fair, you were arguing from the premise that it's doing 2D--and not much of it--but still causing a significant amount of heat:
                Many programs use hardware-accelerated rendering even for 2D. That's when the iGPU generates heat, not when it's sitting idle just refreshing screens.
                Secondly, interpret this however you want:
                My point was that a powerful iGPU is pointless outside of the lower-end segments of the market (APUs, for example, are midrange and low-end, and there it makes sense), and that for higher-end systems an iGPU on the CPU die is a waste of space and thermal budget that would have been better spent on something else CPU-related.
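
                (A hands-on way to see that distinction, assuming a Linux box with intel-gpu-tools installed: compare the per-engine busyness while the desktop sits idle versus while scrolling a hardware-accelerated browser window.)

                Code:
                # shows Render/3D, Blitter and Video engine utilisation of the iGPU; typically needs root
                sudo intel_gpu_top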



                • Originally posted by starshipeleven View Post
                  Many programs use hardware-accelerated rendering even for 2D. That's when the iGPU generates heat, not when it's sitting idle just refreshing screens.
                  My point was that a powerful iGPU is pointless outside of the lower-end segments of the market (APUs, for example, are midrange and low-end, and there it makes sense), and that for higher-end systems an iGPU on the CPU die is a waste of space and thermal budget that would have been better spent on something else CPU-related.
                  There will be those market segments of course, but regardless there -is- a minimum performance requirement that every GPU needs to meet for its generation. And Intel is the only GPU maker where almost all of their GPUs fail to meet that minimum.



                  • Originally posted by duby229 View Post
                    There will be those market segments of course, but regardless there -is- a minimum performance requirement that every GPU needs to meet for its generation.
                    Ah come on, stop this communism already.
                    You're removing the freedom of choosing an office PC and forcing everyone to buy a gaming system even if they don't need it (and pay for it), just because someone upstream decided what is better for them and they must obey.
                    True Murricans value Freedom, they would not force other fellow Murricans to be less free. Freedom! God Bless Murrica!



                    • Originally posted by starshipeleven View Post
                      Ah come on, stop this communism already.
                      You're removing the freedom of choosing an office PC and forcing everyone to buy a gaming system even if they don't need it (and pay for it), just because someone upstream decided what is better for them and they must obey.
                      True Murricans value Freedom, they would not force other fellow Murricans to be less free. Freedom! God Bless Murrica!
                      That's your own prejudice; someday you'll have to deal with that. Expecting the things you buy to do what you want is a type of freedom, the better type.
                      Last edited by duby229; 22 July 2017, 03:44 PM.

