Intel Announces CPU With HBM2 Memory & AMD Graphics


  • #61
    Awesome, now we get tearing graphics on laptops too! I always avoid AMD graphics because frankly they have always sucked on Linux. I liked that most laptops had boring old "Intel graphics." Why must everything have #%#$#$ gaming GPUs? The "gaming" man babies have ruined computing.



    • #62
      Hopefully this makes Intel abandon their OpenCL implementation (i.e. Beignet) in favor of something that works on Radeon (and Intel) GPUs. Perhaps it is time for Mesa Clover to be resurrected?

      The lack of a decent and easily accessible OpenCL implementation on Radeon cards is the only reason we still use Nvidia for number crunching at work. Getting ROCm to work on up-to-date distros is a PITA.

      My hope is that this changes in the near future - hopefully once kernel 4.15 is out, ROCm will work from userspace, making it easy to install on up-to-date distros like Fedora. Until then, the test Vega cards we got as a proof of concept are collecting dust in a bin.
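      For what it's worth, the quickest sanity check for whether any OpenCL stack (ROCm, Clover, Beignet, or a proprietary one) is actually visible on a machine is to enumerate the platforms through the standard ICD loader. Here is a minimal sketch in plain C against the stock OpenCL platform API - nothing vendor-specific, and the 16-platform cap is just an arbitrary choice for the example (build with something like gcc probe.c -lOpenCL):

      ```c
      /* Minimal OpenCL probe: lists the platforms the ICD loader can see,
       * which is usually enough to tell whether ROCm, Clover, Beignet or a
       * proprietary stack is installed and visible. */
      #include <stdio.h>
      #include <CL/cl.h>

      int main(void) {
          cl_uint count = 0;
          if (clGetPlatformIDs(0, NULL, &count) != CL_SUCCESS || count == 0) {
              printf("No OpenCL platforms found.\n");
              return 1;
          }
          if (count > 16)
              count = 16;            /* arbitrary cap for this example */

          cl_platform_id platforms[16];
          clGetPlatformIDs(count, platforms, NULL);

          for (cl_uint i = 0; i < count; ++i) {
              char name[256] = "", vendor[256] = "", version[256] = "";
              clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof name, name, NULL);
              clGetPlatformInfo(platforms[i], CL_PLATFORM_VENDOR, sizeof vendor, vendor, NULL);
              clGetPlatformInfo(platforms[i], CL_PLATFORM_VERSION, sizeof version, version, NULL);
              printf("Platform %u: %s | %s | %s\n", i, name, vendor, version);
          }
          return 0;
      }
      ```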




      • #63
        Originally posted by eggbert View Post
        Awesome, now we get tearing graphics on laptops too! I always avoid AMD graphics because frankly they have always sucked on Linux. I liked that most laptops had boring old "Intel graphics." Why must everything have #%#$#$ gaming GPUs? The "gaming" man babies have ruined computing.
        It's 2017, maybe give modern-day AMD and Intel graphics a try before spreading misinformation.



        • #64
          People seem to be oblivious to the purchase of Toshiba's memory business by a consortium; SK Hynix and Apple (especially Apple) invested to win that business for $18.1 billion. The licensing between members of their portfolios and the expanded availability of HBM2 seem rather timely given Apple's all-in bet on Vega and HBM2.

          More than anything, this is Intel licensing IP from AMD to keep Apple from dumping them completely, apart from the LTE modem for wireless needs. This agreement most likely wouldn't even exist without Zen/Vega/HBM2 and without the Apple iMac Pro and future Mac Pro having options other than Intel.

          Ultimately, with Thunderbolt opening up, Apple can dump Intel altogether in 2018 and go Zen+/Zen 2 and Vega/Navi with their own custom boards adding Thunderbolt 3, and never look back.

          AMD wins either way. SK Hynix needs Apple's business, AMD needs Apple's business, and Intel has leveraged the hell out of Apple's business for 10 years. Samsung just lost out to LG for Apple's OLED orders in 2018 and beyond - one reason the CEO of Samsung retired and conceded a future dimmer than their latest quarter, which benefited from Apple's iPhone X contract and much more. Samsung jacked up all the prices on HBM2 because SK Hynix did not have the resources to supply large quantities.

          AMD suffered, gaming suffered, and only miners got the bits that were available, at inflated prices.

          After the Toshiba deal, within 45 days we have all this news: AMD's GPGPU lines are back at MSRP, availability is common for everyone, and everyone is peeing themselves over this Intel announcement?

          The big deal is how AMD, SK Hynix, TSMC, Apple and anyone not Intel or Samsung fare - never mind Nvidia, still a one-trick pony no one cares about beyond their demoed AI smart systems and gaming.

          Intel isn't worried about Nvidia taking them on in the CPU market. They are worried that Apple is ready for a divorce and that AMD enjoys the double dating.



          • #65
            I know you'll all just skip over this comment and post away anyway, but I'll mention it just for the record:

            These CPUs are still going to have an Intel GPU on them, and it will be the default. The AMD GPU is there for use as a secondary GPU, the same way current Intel laptops have Nvidia discrete graphics as a second option when you need faster performance.

            So the driver is certainly going to come from AMD, just like Nvidia's drivers currently come from Nvidia. And you won't see fewer Intel developers working on their Linux driver - it's still the default experience, and you'll get the same experience you always have from them. You'll just hopefully see a somewhat better PRIME setup, since all the drivers are open source and can work together better.
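            For reference, on today's open-source hybrid setups the discrete GPU is usually picked per application via PRIME render offload, typically by setting DRI_PRIME=1 so Mesa renders on the secondary GPU. A tiny launcher sketch in C - the environment variable and value are just the usual Mesa convention, not anything confirmed for this new Intel+AMD part:

            ```c
            /* Tiny PRIME offload launcher sketch: set DRI_PRIME=1 (Mesa's usual
             * "render on the secondary GPU" switch) and exec the given program.
             * Assumes an open-source Mesa stack that honors DRI_PRIME. */
            #include <stdio.h>
            #include <stdlib.h>
            #include <unistd.h>

            int main(int argc, char **argv) {
                if (argc < 2) {
                    fprintf(stderr, "usage: %s <program> [args...]\n", argv[0]);
                    return 1;
                }
                setenv("DRI_PRIME", "1", 1);   /* ask Mesa for the offload GPU */
                execvp(argv[1], &argv[1]);     /* only returns on failure */
                perror("execvp");
                return 1;
            }
            ```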



            • #66
              At first glance I thought it was a stupid move by AMD, but with proper execution this just might work out well for them. They not only get cash, but also some brand recognition (unless Intel decides to do something really shady by not advertising that the graphics are powered by Radeon).

              Some (including me) were concerned that AMD might be shooting themselves in the foot. But after much thought, I don't think so. It is not like AMD has much to lose, because the only people who buy AMD are the ones who already know about them, and they're not going to lose those customers if they keep providing good products. As for people who want the best of both worlds, now there is a solution! It seems like a combined win for Intel and (more pressingly) AMD, who can fight Nvidia and keep them in check, and a win for us, the users.

              As another member mentioned, this also suggests AMD are confident that this collaboration won't affect their own Zen/Zen+ CPUs. It means that AMD are confident about their CPU designs as well as their GPU designs, which can only be a good thing. Also, I suppose AMD will only supply the chips to Intel. This is in line with what Lisa Su said, that they are "not looking at enabling Intel to compete with their own products". I believe the rumors about the GPU licensing deal with Intel were true, but AMD couldn't just sell their entire GPU IP, because they'd lose their unique advantage. To me, it seems like a genius move by AMD to say, "We won't sell you our entire IP, but we can make a compromise. We'll supply the GPUs; you can package them and sell them all you want, but give us some recognition (and sweet cash). That way it's a win for both of us (and customers)!"



              • #67
                Originally posted by speculatrix View Post
                they could have fitted an Intel + nVidia combination.
                You mean like Microsoft did in the original Xbox? We all know how that turned out. Additionally, given the emphasis on async compute that seems inherent in the PS4 APU, none of the Nvidia graphics solutions at that time was suitable.
                Originally posted by Adarion View Post
                I guess AMD will take care of it. I guess it won't be too different from what they have now, so not much work - "just" the connections and intel's power management / distribution thing.
                The power management part is where I see the crux of the matter. Will Intel be OK with this being released in the open? When it comes to Intel graphics, part of the power management code moved from the FOSS drivers into proprietary firmware this generation.

                Aargh, I knew it:
                Originally posted by PCWorld
                One interesting wrinkle: Intel will be responsible for supplying the drivers for the Radeon GPU, though company engineers won’t write the original code. An Intel representative said they’re working closely with AMD’s Radeon business to supply “day one” drivers for new games, when those drivers become available.
                https://www.pcworld.com/article/3235...s.html?cid=569 via https://www.reddit.com/r/linux_gamin...ore_chip_will/
                Michael, could you try to ask your contacts at Intel/AMD about the Linux situation?
                Last edited by chithanh; 07 November 2017, 08:13 AM.



                • #68
                  Originally posted by L_A_G View Post
                  Still, would be nice to see what actual Vega chip is in this thing. Is it the Vega 10 used in the Vega 64, 56 and Radeon Pro WX 9100? Is it the same chip as the one in the iMac Pro? So many questions, so few answers.
                  This doesn't seem to be Vega, but rather Polaris with an HBM2 memory controller, judging by the device name "gfx804" reported here: https://www.techpowerup.com/img/1tJbfBoky8ckWBLz.jpg

                  I guess AMD want to keep the latest generation for themselves for now. Or perhaps the actual implementation started before Vega chips were available.
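                  For anyone who wants to check what ASIC their own machine reports, the PCI IDs exposed in sysfs are an easy starting point. A small sketch in C, assuming the usual Linux sysfs layout and that the GPU of interest is card0 (an assumption - adjust for multi-GPU boxes); the printed IDs can then be looked up in the public PCI ID tables:

                  ```c
                  /* Hedged sketch: print the PCI vendor/device IDs of the first DRM
                   * device so the GPU ASIC can be identified. Assumes the standard
                   * Linux sysfs layout and that the GPU of interest is card0. */
                  #include <stdio.h>

                  static void print_id(const char *path, const char *label) {
                      FILE *f = fopen(path, "r");
                      char buf[32];
                      if (f && fgets(buf, sizeof buf, f))
                          printf("%s: %s", label, buf);  /* sysfs values end in '\n' */
                      if (f)
                          fclose(f);
                  }

                  int main(void) {
                      print_id("/sys/class/drm/card0/device/vendor", "PCI vendor");  /* 0x1002 = AMD */
                      print_id("/sys/class/drm/card0/device/device", "PCI device");
                      return 0;
                  }
                  ```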



                  • #69
                    Originally posted by uid313 View Post
                    Why didn't Intel just put some resources on beefing up their own GPU?

                    Intel haven't done shit with their GPU since the days of Haswell.
                    Four generations later and it's still the same GPU with just minor tweaks.
                    NVIDIA.

                    NVIDIA is running away with a lot of markets Intel is desperately trying to get into. In addition, Intel really doesn't have a power-efficient enough GPU to make itself a major player in the mobile market. AMD helps in both cases, as their GPU is better suited for high-performance markets while having a superior power profile to Intel's.

                    That being said, there's a part of me wondering if this is a precursor to Intel purchasing AMD's GPU division outright.



                    • #70
                      Originally posted by Mabhatter View Post

                      It’s not really a threat to AMD directly. As much as we love the scrappy underdogs, AMD will never get past the 30% range in CPUs; they never broke that for long before. First, there are whole markets that just won’t buy AMD CPUs but love ATI graphics - like Apple, and pretty much any enterprise-level PC shopper and IT department. Fab capacity is getting pretty scarce right now. Only a few places offer the highest-end fab, and between consoles (Sony, Microsoft, Nintendo - a lot of IBM fab there) and mobile devices (Apple & Samsung) the available fab capacity is critically full.

                      If AMD is “renting” the design to Intel, and Intel is fabbing on last-gen CPU processes, then it’s a double win for AMD to get paid and not have to deal with manufacturing and QA.
                      There are so many things wrong here.

                      First, in 1999 and 2000 AMD crossed the 50% mark in CPU sales volumes a couple of times. They never had anywhere near Intel's revenue, but they can and have supplied the majority of the world's processors before. Plus, companies aren't static entities; past performance is not indicative of the future. If it were, there'd be IBMs on all of our desks and Apple would have gone out of business.

                      Enterprise shoppers have bought plenty of non-Intel systems before, and if AMD is able to get design wins and be offered in competitive products through major manufacturers, enterprise buyers will assuredly come. This is especially true in data centers, where performance per watt is the only thing that matters. EPYC clearly has an advantage there in certain workloads, so it will certainly capture a significant chunk of the market. That wasn't true prior to EPYC, when aging Opterons were outperformed per watt in basically everything.

                      AMD can't simply move Radeon designs to Intel fabs; that's not how this is going to work. The design and tooling changes would be huge, and it would require Intel to reveal more about their fab tech than they are probably willing to. AMD will manufacture the Radeon dies (likely through their partnership with GlobalFoundries, but maybe some with TSMC or Samsung), and Intel will use their EMIB tech to interconnect them. The whole wow factor of EMIB is its ability to act as a low-cost interconnect between "heterogeneous silicon" (i.e. dies manufactured on different processes and technologies), which is exactly what is going on here.

                      AMD will have to deal with manufacturing and QA just as they do now with their other products; they will simply get to avoid the headaches of chip packaging and of ironing out this new interconnect. Hopefully they are getting some access to EMIB in exchange, but otherwise this looks a lot like AMD's other custom silicon deals with Microsoft and Sony, and it may have even better IP terms for AMD than those deals, since Intel likely needed AMD more than AMD needed Intel.
                      Last edited by existensil; 08 November 2017, 05:34 PM.

