
Intel Gen12 Graphics Bring EU Fusion - EUs Fused In Pairs


  • starshipeleven
    replied
    Originally posted by DavidC1 View Post
    While it's true they need a new mask for the silicon, Intel's designs since Sandy Bridge (2nd Gen Core, 2011) have been designed in a way that makes it easy to add and remove cores, so it doesn't require as much work at the design level. This is why they have numerous different dies.
    This is how everyone does it. AMD(before the chiplets), NVIDIA, random ARM SoC manufacturer...

    Once you have the largest die (for example, the 4770K), you just cut the number of cores and graphics to make smaller ones. They are not as inflexible as people think, or as they used to be.
    They are cutting them at the design stage and not in the silicon, which does not justify adding crap that isn't needed by the whole product line.
    Let me repeat: on desktop, no one gives a fuck about the iGPU. As long as it is enough for office work and browsing, it's enough for the low end, and for the high end too.
    If they only cared about what the desktop consumer wanted in that desktop consumer part, the "high-end iGPU" on the "largest die" would be the crappy basic one, and every chip would get the same iGPU.

    And there's no reason to think it'll cost us much less by not including one, or by having a very small one.
    Yes there is. Oh boy, yes. It's stuff that isn't developed and isn't used, so the costs would be lower. The point is that Intel isn't developing only for desktop; it has to sell in those other markets too and pay for that development.

    The markets that do need the graphics pay many times over for those that think they don't.
    No, I think in this specific case (like in most others) it's the reverse: the higher-end parts are paying for the development of the lower-end parts.
    Just like with business graphics cards vs. gaming cards, or with consumer PCs vs. HPC/server. The consumer product exists only because a large chunk of the development is shared with, and paid for by, the business segment, and the manufacturer decides it's worth making a crapified version for a dramatically lower price with VERY small margins (or even at a slight loss).

    So for the iGPUs not to suck in laptops, where they actually matter to many people, Intel has to develop them, but it cannot have the laptop market bear the whole brunt of the cost, as the price would be too high and it would backfire on them. So they add powerful iGPUs to desktop chips too, tacking an extra $20-30 onto a $300-400 CPU, where it's not going to break anyone's back.

    Boom, bean counting problem solved, everyone is happy.
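    As a rough back-of-envelope for that amortization argument (every number below is invented purely for illustration, not an actual Intel figure), a small Python sketch shows why spreading the same iGPU across more segments shrinks the per-chip R&D burden:

        # Hypothetical amortization sketch; every figure here is made up.
        igpu_rnd_cost = 500_000_000   # assumed total iGPU development cost, in $
        laptop_units = 50_000_000     # assumed laptop CPUs shipped with the iGPU
        desktop_units = 40_000_000    # assumed desktop CPUs that also carry it

        # If the laptop market alone had to absorb the R&D bill:
        per_laptop_only = igpu_rnd_cost / laptop_units

        # If desktop chips carry the same iGPU and share the bill:
        per_chip_shared = igpu_rnd_cost / (laptop_units + desktop_units)

        print(f"laptop-only amortization: ${per_laptop_only:.2f} per chip")   # $10.00
        print(f"shared amortization:      ${per_chip_shared:.2f} per chip")   # ~$5.56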

    I also know there are buyers in the $150+ segment who care about having the fastest iGPU. Otherwise there wouldn't be a thread on the Anandtech forums grumbling about Ryzen 4000 having its number of CUs cut.
    Yo, that's the laptop crowd, exactly who I'm talking about.

    Just get the iGPU-less F CPU if you don't care about the graphics.
    We are talking about Intel here, and for most of the lineup there are no iGPU-less CPUs until you go to the enthusiast product line, which is very expensive.

    AMD, for example, has mostly CPUs with no iGPU and only a couple of APUs with a relatively powerful iGPU (and that makes sense). They can do that because the development cost of an APU's iGPU is shared with their dedicated GPU division (an APU literally integrates a dedicated GPU design that was only modified a little to become an iGPU), but Intel does not have that, so they have to make do with these ass-backwards decisions like placing a powerful iGPU in CPUs that will, 99% of the time, be paired with a dedicated card and will never use it.



  • DavidC1
    replied
    Originally posted by starshipeleven View Post
    AFAIK this isn't true. Yes, the component designs are copy-paste (the cores and the iGPU's EUs are the same blocks added in multiple copies), but the actual silicon isn't: they make at least a dual-core line and a quad-core line for desktop, and another two lines for dual- and quad-core laptop parts.
    While it's true they need a new mask for the silicon, Intel's designs since Sandy Bridge (2nd Gen Core, 2011) have been designed in a way that makes it easy to add and remove cores, so it doesn't require as much work at the design level. This is why they have numerous different dies.

    Intel's Atoms, for example, had 5 different dies, and their ASPs and volumes are far lower than Core's. This wouldn't be possible if making a new die were that expensive or complex. You still need the base die, but this approach makes it much, much easier and less costly. It also means their factories are set up to mass-produce multiple different dies.

    Once you have the largest die (for example, the 4770K), you just cut the number of cores and graphics to make smaller ones. They are not as inflexible as people think, or as they used to be.

    Some designs are indeed very different, such as Atom versus Core, or their client chips versus server. Those require much more work. But within a segment it's very flexible. That's true even for their graphics: every generation they talk about increased scalability. It's not something we care about, but it matters to them.

    And there's no reason to think it'll cost us much less by not including one, or by having a very small one. The markets that do need the graphics pay many times over for those that think they don't. I also know there are buyers in the $150+ segment who care about having the fastest iGPU. Otherwise there wouldn't be a thread on the Anandtech forums grumbling about Ryzen 4000 having its number of CUs cut.

    Just get the iGPU-less F CPU if you don't care about the graphics.
    Last edited by DavidC1; 07-17-2020, 08:28 PM.



  • starshipeleven
    replied
    Originally posted by DavidC1 View Post
    There's no benefit for them in making a smaller iGPU for higher-end CPUs, because they base all dies on the highest configuration anyway.
    AFAIK this isn't true. Yes, the component designs are copy-paste (the cores and the iGPU's EUs are the same blocks added in multiple copies), but the actual silicon isn't: they make at least a dual-core line and a quad-core line for desktop, and another two lines for dual- and quad-core laptop parts.
    See, for example, the Broadwell die shots on WikiChip; there are at least 4 different die production lines for it. https://en.wikichip.org/wiki/intel/m...adwell_(client)

    So in the case of the high-end chips, the worst a die can be cut down to is still midrange, and it still makes little sense to have a great iGPU on an i5 that is most likely paired with a dedicated card.




  • DavidC1
    replied
    Originally posted by starshipeleven View Post
    I always thought there were some bean counters behind that seemingly ass-backwards decision.
    Someone decided to put it there because on i5/i7 they could charge more and use that to pay for other devices where they add graphics at "a loss", or some other shenanigan like that, since at the end of the day all Intel iGPUs are the same thing, whose development cost must be spread across multiple products.
    What decision is behind people wanting smaller iGPUs? It's the same thing: saving cost.

    There's no benefit for them in making a smaller iGPU for higher-end CPUs, because they base all dies on the highest configuration anyway. The lower-end configurations are just cut-down versions of the highest config, which coincidentally (or perhaps not so coincidentally) contains the largest iGPU.

    It'll actually cost MORE to make an i7 chip with a smaller iGPU, because that's an entirely different configuration. Of course they could disable part of the iGPU, but why would you do that?



  • starshipeleven
    replied
    Originally posted by bug77 View Post

    But why can't they do that for an i5 or better?
    I always thought there were some bean counters behind that seemingly ass-backwards decision.
    Someone decided to put it there because on i5/i7 they could charge more and use that to pay for other devices where they add graphics at "a loss", or some other shenanigan like that, since at the end of the day all Intel iGPUs are the same thing, whose development cost must be spread across multiple products.



  • bug77
    replied
    Originally posted by ferry View Post

    That's what they have :-) (joking)

    And quite effectively. I have a NUC with a BYT (Atom) single core at 1.5 GHz. It runs KDE (which requires OpenGL) and Kodi (OpenGL, plus VA-API for decoding). It plays 1080p nicely, and I never have to shut it down because it consumes << 5 W. No way this would work without an IGP.
    But why can't they do that for an i5 or better?



  • ferry
    replied
    Originally posted by bug77 View Post
    I wonder why no one builds a CPU with a minimalistic IGP. I don't need to waste 1/3 to 1/2 of the die size just because my GPU might crap out some day and I won't have another one readily available. Just build the smallest IGP that can run an office suite and a web browser decently and you've covered everyone but gamers. Well, except "casual" gamers (whatever that is), because the gamers I know will want a discrete card anyway.
    That's what they have :-) (joking)

    And quite effectively. I have a NUC with a BYT (Atom) single core at 1.5 GHz. It runs KDE (which requires OpenGL) and Kodi (OpenGL, plus VA-API for decoding). It plays 1080p nicely, and I never have to shut it down because it consumes << 5 W. No way this would work without an IGP.



  • bug77
    replied
    I wonder why no one builds a CPU with a minimalistic IGP. I don't need to waste 1/3 to 1/2 of the die size just because my GPU might crap out some day and I won't have another one readily available. Just build the smallest IGP that can run an office suite and a web browser decently and you've covered everyone but gamers. Well, except "casual" gamers (whatever that is), because the gamers I know will want a discrete card anyway.



  • starshipeleven
    replied
    Originally posted by bachchain View Post

    Had no idea that Intel sponsored the Joint European Torus
    they have a lot of experience in wasting power



  • bachchain
    replied
    Intel Gen12 Graphics Bring EU Fusion
    Had no idea that Intel sponsored the Joint European Torus

