
Intel Announces CPU With HBM2 Memory & AMD Graphics


  • #51
    Originally posted by scottishduck View Post
    As we know, nvidia GPUs only work with nvidia CPUs on nFORCE chipsets...
    No duh. How many NVIDIA GPUs are embedded in other, non-NVIDIA devices at the die or package level? That was my point: only Tegra/Jetson qualifies.
    Soldering chips onto the same board isn't the same thing.

    NVIDIA's "custom" market is still one where NVIDIA runs the show, see their supercomputers.

    Comment


    • #52
      Originally posted by phoronix View Post
      Phoronix: Intel Announces CPU With HBM2 Memory & AMD Graphics

      Following the rumors for months about an Intel CPU with integrated AMD graphics, it has been announced today and is actually happening!..

      http://www.phoronix.com/scan.php?pag...D-Graphics-CPU
      It's probably safe to say at this point that, alongside Torvalds and many of the KDE devs, Intel just became a major NVidia hater. Hope NVidia chokes on Intel+AMD's success.

      I've not upgraded my Lenovo ThinkPads in a while; if the T series gets this in an upcoming release, I'm probably gonna buy it just because of my severe dislike of NVidia.
      Last edited by MartinN; 06 November 2017, 06:21 PM.

      Comment


      • #53
        Originally posted by speculatrix View Post
        they could have fitted an Intel + nVidia combination.
        Bad design choice for a mass-produced system, and Intel + NVIDIA aren't known for offering embedded-device-budget-grade "custom" designs, as both make custom stuff only for the high-end server and HPC market.

        Comment


        • #54
          I wonder if we'll see Intel commits toward AMD graphics hardware.

          Comment


          • #55
            Originally posted by chithanh View Post
            From AMD's press release, this is a semi-custom design. I wonder which company will be responsible for providing graphics drivers for this product.
            Will it be AMD? Or will it be Intel? In the latter case I fear a repeat of the PowerVR situation, where Intel would just stop publishing drivers after not too long.

            I guess AMD will take care of it. I guess it won't be too different from what they have now, so not much work - "just" the connections and Intel's power management / distribution thing.
            Even if Intel blocked something or AMD lacked people, remember that PS... 3 4? whatever hack where people were just using the free driver stack on Gentoo after they had hacked into the device? I think it shouldn't be too different here.

            It also seems that AMD trusts their own Zen / future Zen+/2 designs to be competitive even with Intel+Radeon APUs. Good thing, that. Besides, it gives them an opportunity to sell some chips in competition with Nvidia.
            Stop TCPA, stupid software patents and corrupt politicians!

            Comment


            • #56
              I remember almost exactly a year ago how rumors about something like this being in Intel's pipeline surfaced and Intel, using some unusually clear language, denied the whole thing.

              Probably should be used to it by now, but I still find it a bit jarring when a company just flat-out lies about something like this. Usually whenever something like this leaks out the statements are vague enough that they can be interpreted either way and the clearly worded denials are reserved for actual hoaxes.

              Still, I'm clearly not the only one who gets the feeling AMD is shooting themselves in the foot slightly here, with Vega APUs finally being ready to hit shelves. Maybe AMD just came to the conclusion that they couldn't compete with Intel+Nvidia in the higher-end market, so they had to go at the lower-end market alone and at the higher-end market together with Intel. This would fit pretty well with how small the GPU in the Vega-based APUs unveiled so far is.

              Still, it would be nice to know which actual Vega chip is in this thing. Is it the Vega 10 used in the Vega 64, 56 and Radeon Pro WX 9100? Is it also the same chip as the one in the iMac Pro? So many questions, so few answers.
              Last edited by L_A_G; 06 November 2017, 07:07 PM.
              "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

              Comment


              • #57
                Originally posted by chuckula View Post

                Uh, Intel is using EMIB here, and it is an extremely advanced interconnect that allows for compact, high-speed I/O between completely different pieces of silicon made by completely different manufacturers on completely different lithographic processes. It's light years ahead of putting some traces on a PCB to connect two chips together, like has been done since the 1970s. Additionally, it's vastly more efficient than requiring a massive silicon interposer that is cost prohibitive.
                Maybe AMD getting access to that sweet interconnect technology is part of the deal. AMD has pretty much bet on glued dies so working with a competitor in exchange for better glue might be a good trade. Plus, the obvious revenue.

                I'm sure Apple played a big role in pushing Intel into this. Hope AMD made the most of the bargaining position they were in.

                Comment


                • #58
                  Originally posted by eydee View Post
                  Inb4 all of these exclusively go into Macbooks.
                  That was my first thought. Apple and ATI (now AMD) have a very long relationship. Frankly, Apple is tired of Intel's shenanigans, forcing computer makers to choose between crappy, three-year-old integrated graphics or building out a full PCIe bus just for one chip. I'd guess Apple is knocking on Intel's door with threats to go to that new Ryzen Mobile chip; now that Thunderbolt isn't Intel's hostage, Apple is big enough to source the chips and eat the cost.

                  Funny how, after nearly ten years of Intel holding the notebook graphics market hostage, it's broken like this. The only company with enough balls and money to beat up Intel is Apple.

                  Comment


                  • #59
                    Originally posted by speculatrix View Post
                    I wonder how hard Microsoft and Sony screwed AMD down in price in order for AMD to get their chips into the Xbox and PS4?

                    I wonder if MS and Sony helped bring AMD and Intel together to make this chip?
                    AMD doesn't own fabs anymore, so price isn't as drastic a problem if they get a consistent royalty paycheck. And 3-7 years from one design and a few speed bumps is easy money.

                    Comment


                    • #60
                      Originally posted by Adarion View Post


                      I guess AMD will take care of it. I guess it won't be too different from what they have now, so not much work - "just" the connections and Intel's power management / distribution thing.
                      Even if Intel blocked something or AMD lacked people, remember that PS... 3 4? whatever hack where people were just using the free driver stack on Gentoo after they had hacked into the device? I think it shouldn't be too different here.

                      It also seems that AMD trusts their own Zen / future Zen+/2 designs to be competitive even with Intel+Radeon APUs. Good thing, that. Besides, it gives them an opportunity to sell some chips in competition with Nvidia.
                      It's not really a threat to AMD directly. As much as we love the scrappy underdogs, AMD will never get past the 30% range in CPUs; they never broke past that before. First, there are whole markets that just won't buy AMD CPUs but love ATI graphics - like Apple, and pretty much any enterprise-level PC shopper and IT department. Fab capacity is getting pretty scarce right now. Only a few places offer the highest-end fab, and between consoles (Sony, Microsoft, Nintendo - a lot of IBM fab there) and mobile devices (Apple & Samsung), the available capacity is critically full.

                      If AMD is "renting" the design to Intel, and Intel is fabbing it on last-gen CPU processes, then it's a double win for AMD: get paid and not have to deal with manufacturing and QA.

                      Comment
