It Looks Like Intel Could Begin Pushing Graphics Tech More Seriously


  • duby229
    replied
    Originally posted by starshipeleven View Post
    Top of the line Intel products are usually paired with a dedicated GPU and pwn consoles, try harder.

    But... But... what about people that buy a Nintendo Wii or Shield then expect to play Halo on it?!? You no think of them? Dem kitties!
    Shit man, I can point to hundreds of computer models with high-end Intel parts that sold in the millions at consumer retail. You keep insisting it's just workstations and that everything else has a dedicated card, but you're wrong. There are hundreds of millions of Intel consumer-level retail machines that don't have one, maybe even billions. Which btw is -exactly- why Intel can afford to increase graphics performance across the entire board: it would only cost a few pennies per die, because of the huge numbers sold.
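    A back-of-the-envelope sketch of that amortization argument, in Python. Every number below is a made-up placeholder for illustration, not real Intel data: wafer cost, die size, and the extra area for a beefier iGPU are all assumptions.
    Code:
    # Hypothetical amortization of extra iGPU silicon on a high-volume part.
    # All figures are illustrative placeholders, not real Intel numbers.
    wafer_cost_usd = 8000.0      # assumed cost of one finished 300 mm wafer
    wafer_area_mm2 = 70685.0     # usable area of a 300 mm wafer (pi * 150^2)
    extra_igpu_area_mm2 = 5.0    # assumed extra area for more EUs

    cost_per_mm2 = wafer_cost_usd / wafer_area_mm2
    extra_cost_per_die = extra_igpu_area_mm2 * cost_per_mm2
    print(f"extra silicon cost per die: ${extra_cost_per_die:.2f}")
    # -> roughly $0.57 per die under these assumptions, i.e. tens of cents.
    # Ignores yield loss, which grows with die area, so treat as a lower bound.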
    Last edited by duby229; 22 July 2017, 05:44 PM.



  • starshipeleven
    replied
    Originally posted by duby229 View Post
    And meanwhile people buy their top of the line Intel products only to realize they can't do what they want.
    Top of the line Intel products are usually paired with a dedicated GPU and pwn consoles, try harder.

    Then they go buy a PS4 or Xbone. Which does.
    But... But... what about people that buy a Nintendo Wii or Shield then expect to play Halo on it?!? You no think of them? Dem kitties!



  • duby229
    replied
    Originally posted by starshipeleven View Post
    Expecting the things you buy not to be overpriced and not to carry a useless weight/performance penalty because some third party mandates it is Freedom.

    Mandating that everything should be like you want is Communism. Stop playing Stalin plz.
    And meanwhile people buy their top of the line Intel products only to realize they can't do what they want. Then they go buy a PS4 or Xbone. Which does. Again, that's how Intel has harmed PC gaming. The only solution is a minimum performance bar at each generation.
    Last edited by duby229; 22 July 2017, 04:11 PM.



  • starshipeleven
    replied
    Originally posted by duby229 View Post
    That's your own prejudice; someday you'll have to deal with that. Expecting the things you buy to do what you want is a type of freedom, the better type.
    Expecting the things you buy not to be overpriced and not to carry a useless weight/performance penalty because some third party mandates it is Freedom.

    Mandating that everything should be like you want is Communism. Stop playing Stalin plz.



  • duby229
    replied
    Originally posted by starshipeleven View Post
    Ah come on, stop this communism already.
    You're removing the freedom of choosing an office PC and forcing everyone to buy a gaming system even if they don't need it (and pay for it), just because someone upstream decided what is better for them and they must obey.
    True Murricans value Freedom, they would not force other fellow Murricans to be less free. Freedom! God Bless Murrica!
    That's your own prejudice; someday you'll have to deal with that. Expecting the things you buy to do what you want is a type of freedom, the better type.
    Last edited by duby229; 22 July 2017, 03:44 PM.



  • starshipeleven
    replied
    Originally posted by duby229 View Post
    There will be those market segments of course, but regardless there -is- a minimum performance requirement that every GPU needs to meet for its generation.
    Ah come on, stop this communism already.
    You're removing the freedom of choosing an office PC and forcing everyone to buy a gaming system even if they don't need it (and pay for it), just because someone upstream decided what is better for them and they must obey.
    True Murricans value Freedom, they would not force other fellow Murricans to be less free. Freedom! God Bless Murrica!



  • duby229
    replied
    Originally posted by starshipeleven View Post
    Many programs use hardware-accelerated rendering even for 2D. That's when the iGPU generates heat, not when it's sitting idle just refreshing screens.
    My point was that a powerful iGPU is pointless outside the lower-end segments of the market (APUs, for example, are midrange and low-end, and there it makes sense), and that on higher-end systems an iGPU on the CPU die is a waste of die space and thermal budget that would have been better spent on something CPU-related.
    There will be those market segments of course, but regardless there -is- a minimum performance requirement that every GPU needs to meet for its generation. And Intel is the only GPU maker where almost none of their GPUs meet that minimum.



  • starshipeleven
    replied
    Originally posted by Opossum View Post
    Though to be fair, you were arguing the premise that it's doing 2D (and not much of it) but still causing a significant amount of heat:
    Many programs use hardware-accelerated rendering even for 2D. That's when the iGPU generates heat, not when it's sitting idle just refreshing screens.
    Secondly, interpret this however you want:
    My point was that a powerful iGPU is pointless outside the lower-end segments of the market (APUs, for example, are midrange and low-end, and there it makes sense), and that on higher-end systems an iGPU on the CPU die is a waste of die space and thermal budget that would have been better spent on something CPU-related.



  • wdb974
    replied
    Originally posted by caligula View Post

    The Steam recommendations on Linux seem bogus. For instance they claim the games require more memory than on Windows. I wonder how that works. My Windows setup uses 1 GB when fully started, my Linux game machine 270 MB. That's 800 MB of headroom. The game might recommend 2 GB on Windows and 3-4 on Linux. Sooo... game uses 1 GB on Windows, 2.7 to 3.7 gigs on Linux. Yea rite.
    My comment was about Intel's drivers generally speaking, without taking any OS into consideration. Your point is still valid though.
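    Spelling out the arithmetic from the quote above as a quick Python check (the idle-footprint and recommendation figures are the ones caligula gives, eyeballed per-machine numbers, not benchmarks):
    Code:
    # Implied game footprint = recommended RAM - idle OS footprint,
    # using the rough figures from the quoted post.
    idle_mb = {"Windows": 1024, "Linux": 270}
    recommended_mb = {"Windows": [2048], "Linux": [3072, 4096]}

    for os_name, idle in idle_mb.items():
        implied = [rec - idle for rec in recommended_mb[os_name]]
        print(os_name, "implied game footprint (MB):", implied)
    # Windows -> [1024]        (~1 GB)
    # Linux   -> [2802, 3826]  (~2.7 to 3.7 GB, matching the quote)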



  • Opossum
    replied
    Originally posted by starshipeleven View Post
    Driving 2D screens isn't exactly an intensive job.
    3D or hardware decoding is.
    Though to be fair, you were arguing the premise that it's doing 2D (and not much of it) but still causing a significant amount of heat:

    Originally posted by starshipeleven View Post
    I bunched together 2 things,
    -Desktop users where the iGPU is in fact shut down but still wasting die space that would have been better used for whatever, even just a bigger L3/4 cache
    -Laptop users where the iGPU has the screens attached so it is always on, and while still wasting die space it also wastes thermal budget on a very thermally-constrained part already.

    On a part that most likely has around 15W total TDP (and a cooling system sized accordingly), yeah that matters.


    Secondly, interpret this however you want:
    My setup is a 1440p60 monitor hooked up to an NVIDIA GTX 960 and a 1080p60 monitor hooked up to an Intel HD 530. Testing platform is Windows 10 (can't do this test in Linux). I can force which graphics card Firefox uses for its hardware acceleration by temporarily setting the respectively-attached monitor to be the OS's main display, then launching Firefox (or any GPU-using thing like games).

    I am playing the same 1080p60 YouTube video in all the decoding entries.
    "decode+1440p" means I am decoding the video and sending the frames to the 1440p monitor, which is attached to the NVIDIA card; similarly, "decode+1080p" means decoding the video and sending the frames to the 1080p monitor, which is attached to the Intel.
    Numbers are eyeballed averages over a few seconds.

    decode with HD 530:
    idle: 1.2 W
    decode+1440p: 2.8 W
    decode+1080p: 2.1 W
    render NVIDIA frames: 1.6 W

    decode with GTX 960:
    idle: 12 W
    decode+1440p: 30-32 W
    decode+1080p: 29-30 W
    render Intel frames: 14-16 W or 28-29 W (suspect this is due to frequency+voltage scaling)
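    For what it's worth, those eyeballed averages could be automated on the NVIDIA side. A minimal Python sketch, assuming nvidia-smi is on the PATH and a single NVIDIA GPU that reports power.draw (nvidia-smi ships on both Windows and Linux; the Intel HD 530 side would need a separate tool such as RAPL on Linux, not covered here):
    Code:
    # Average GPU power over a sampling window instead of eyeballing it.
    # Assumes one NVIDIA GPU whose driver reports power.draw via nvidia-smi.
    import subprocess
    import time

    def sample_power_watts():
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return float(out.stdout.strip())

    def average_power(seconds=10.0, interval=0.5):
        samples = []
        end = time.monotonic() + seconds
        while time.monotonic() < end:
            samples.append(sample_power_watts())
            time.sleep(interval)
        return sum(samples) / len(samples)

    print(f"avg draw over 10 s: {average_power():.1f} W")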

