Intel Gen12 Graphics Bring EU Fusion - EUs Fused In Pairs


  • #11
Originally posted by DavidC1
    There's no benefit for them making a smaller iGPU for higher end CPUs, because they base all dies on the highest configuration anyway.
    Afaik this isn't true. Yes, the component designs are copy-paste (the cores and the EUs of the iGPU are the same blocks added in multiple copies), but the actual silicon isn't: they make at least a dual-core line and a quad-core line for desktop, and another two lines for dual and quad for laptops.
    See for example the die pics on WikiChip for Broadwell; there are at least four different die production lines for it. https://en.wikichip.org/wiki/intel/m...adwell_(client)

    So for the high-end chips the worst-case bin is still midrange, and it still makes little sense to put a great iGPU on an i5 that is most likely paired with a dedicated card.




    • #12
      Originally posted by starshipeleven
      Afaik this isn't true. Yes, the component designs are copy-paste (the cores and the EUs of the iGPU are the same blocks added in multiple copies), but the actual silicon isn't: they make at least a dual-core line and a quad-core line for desktop, and another two lines for dual and quad for laptops.
      While it's true they need a new mask for each die, Intel's designs since Sandy Bridge (2nd Gen Core, 2011) have been built in a way that makes it easy to add and remove cores, so it doesn't require as much work at the design level. This is why they have numerous different dies.

      Intel's Atoms, for example, had five different dies, and their ASPs and volumes are far lower than Core's. This wouldn't be possible if making a new die were so expensive or complex. You still need the base die, but this approach makes derivatives much, much easier and less costly. It also means their factories are set up to mass-produce multiple different dies.

      Once you have the largest die (for example, the 4770K), you just cut the number of cores and graphics units to make smaller ones. They are not as inflexible as people think, or as they used to be.

      Some designs are indeed very different, such as Atom versus Core, or their client chips versus server. Those require much more work. But within a segment it's very flexible. That's true even for their graphics: every generation they talk about increased scalability. It's not something we care about, but it matters to them.

      And there's no reason to think it would cost much less for us if they left it out, or included a very small one. The markets that do need the graphics pay many times over for those that think they don't. I also know there are people buying in the $150+ segment who care about having the fastest iGPU; otherwise there wouldn't be a thread on the Anandtech forums grumbling about Ryzen 4000 having its number of CUs cut.

      Just get the iGPU-less F CPU if you don't care about the graphics.
      Last edited by DavidC1; 07-17-2020, 08:28 PM.



      • #13
        Originally posted by DavidC1
        While it's true they need a new mask for each die, Intel's designs since Sandy Bridge (2nd Gen Core, 2011) have been built in a way that makes it easy to add and remove cores, so it doesn't require as much work at the design level. This is why they have numerous different dies.
        This is how everyone does it: AMD (before the chiplets), NVIDIA, any random ARM SoC manufacturer...

        Once you have the largest die (for example, the 4770K), you just cut the number of cores and graphics units to make smaller ones. They are not as inflexible as people think, or as they used to be.
        They are cutting them at the design stage, not in the silicon, which does not justify adding crap that isn't needed by the whole product line.
        Let me repeat: on desktop nobody gives a fuck about the iGPU. As long as it is enough for office and browsing, it's enough for the low end, and the high end too.
        If they only cared about what desktop consumers wanted in a desktop part, the "high-end GPU" on the "largest die" would be the crappy basic one, and every chip would get the same iGPU.

        And there's no reason to think it would cost much less for us if they left it out, or included a very small one.
        Yes there is. Oh boy, yes. It's stuff that isn't developed and isn't used, so the costs would be lower. The point is that Intel isn't developing for desktop only; they must sell in those other markets and pay for that development too.

        The markets that do need the graphics pay many times over for those that think they don't.
        No, I think in this specific case (like in most others) it's the reverse: higher-end parts are paying for the development of the lower-end parts.
        Just like with business graphics cards vs gaming cards, or consumer PCs vs HPC/servers. The consumer product exists only because a large chunk of the development is shared with, and paid for by, the business segment, and the manufacturer decides it's worth making a crapified version for a dramatically lower price with VERY small margins (or even at a slight loss).

        So for the iGPUs not to suck in laptops, where they actually matter to many, Intel has to develop them, but it can't just have the laptop market take the whole brunt of the cost: it would be too high and would backfire on them. So they put powerful iGPUs in desktop chips too, and add $20-30 on top of a $300-400 CPU, where it's not going to break anyone's back.

        Boom, bean counting problem solved, everyone is happy.

        I also know there are people buying in the $150+ segment who care about having the fastest iGPU; otherwise there wouldn't be a thread on the Anandtech forums grumbling about Ryzen 4000 having its number of CUs cut.
        Yo, that's the laptop crowd, exactly what I'm talking about.

        Just get the iGPU-less F CPU if you don't care about the graphics.
        We are talking about Intel here, and for most of their lineup there are no iGPU-less CPUs until you get into the Enthusiast line, which is very expensive.

        AMD, for example, has mostly CPUs with no iGPU and only a couple of APUs with relatively powerful iGPUs (and that makes sense). They can do that because the development cost of an APU's iGPU is shared with their dedicated GPU division (an APU literally integrates a dedicated GPU design that was only modified a little to become an iGPU). Intel does not have that, so they have to make do with ass-backwards decisions like placing a powerful iGPU in CPUs that will 99% of the time be paired with a dedicated card and never use it.

