Intel Offers New Xe Graphics Details, Product Updates At 2020 Architecture Day


  • #11
    Originally posted by Jumbotron View Post
    The biggest news from this "Architecture Day" wasn't even released. It was this...

    1: You won't BELIEVE the size of the Marketing Kickback dollars we are going to give to Dell, HP, Lenovo, Asus, Acer and Google to put our shit in their systems next year and shut out decent design wins for AMD.

    2: We have come up with a new and IMPROVED glue formulation that is 10X STRONGER in order to keep all the new decals on all those Marketing Kickback funded computers. Along with the ubiquitous "Intel i-Something_Meaningless Inside" decals you will now see "Intel Xe_Something Meaningless Powered" decals, "Intel OneAPI Powered" decals, and "This Laptop/Desktop/AIO Is Powered By Marketing Kickbacks" decals festooned ALL over your device's chassis. All with new and improved 10X stronger "Intel Inside" glue to make it even MORE impossible to pry off. Intel will probably make a decal for that too.

    So you noticed too that the only good AMD designs are either Gigabyte, Echostream, or the Lenovo SR655?
    Everyone else "couldn't figure out" or "doesn't know how" to make a good AMD system. Kind of sad.
    Last edited by onlyLinuxLuvUBack; 13 August 2020, 01:51 PM.



    • #12
      Yeah... it's an age-old technique by Intel. It's Intel's version of what Trump is doing in screwing up the U.S. Postal Service in order to have a hope of winning: distort the market because the courts will look the other way. It's not anti-competitive, it's HYPER-competitive. So it's all good.



      • #13
        Phoronix, the Intel Newsroom link at the bottom is broken.



        • #14
          Originally posted by programmerjake View Post
          Phoronix, the Intel Newsroom link at the bottom is broken.
          Michael, please go to bed... at least once!



          • #15
            I am genuinely excited by the prospects of the Xe GPUs... and oneAPI; provided Intel supports both long term - i.e., for more than two or three years before (sort of) announcing it didn't live up to expectations and dropping all support.

            I'm not expecting great things of Xe for the first generation - or even the second. If the best first-gen Xe manages to get anywhere close to an RTX2060 I'd consider that a significant success. But they will need to at least get within 10% of AMD (at parity pricing) in first/second gen and trade blows with whatever top-end nVidia offers in both the GPGPU and gaming spaces by third gen. And drivers need to not suck, too.

            The comment about "more than a generational leap in CPU performance" has me chuckling quietly, though. I would hope the jump is that big, considering that IPC hasn't really increased all that much since Sandy Bridge. I was amused recently to see a friend still running a 2600K (albeit heavily overclocked) with an RTX 2070 and getting great performance in games. Heck, he gets better performance than me in Total War: Warhammer 2 - which does, admittedly, love high-clock CPUs. AMD really kicked Intel hard with Ryzen/Epyc, even more than they did with Athlon 64.



            • #16
              Alder Lake seems dumb. Is it really just a way for their marketing team to be able to claim they have 16-core CPUs, even though half the cores will be shit?

              I get why big.LITTLE makes sense on laptops or anything running off a battery, but on a desktop?



              • #17
                Waiting until Rikki Lake comes out. But word on the street is that it's chatty on the bus.



                • #18
                  Originally posted by smitty3268 View Post
                  Alder Lake seems dumb. Is it really just a way for their marketing team to be able to claim they have 16-core CPUs, even though half the cores will be shit?
                  Actually, that's a really good point - AMD recently settled that ridiculous class-action suit about Bulldozer not being a "proper 8-core CPU" because a two-core module shared a single FPU.

                  So would this open up similar possibilities for Intel getting hit with "it's not a real XYZ-core CPU!" if all cores are not equal?



                  • #19
                    Originally posted by Paradigm Shifter View Post
                    I am genuinely excited by the prospects of the Xe GPUs... and oneAPI; provided Intel supports both long term - i.e., for more than two or three years before (sort of) announcing it didn't live up to expectations and dropping all support.

                    I'm not expecting great things of Xe for the first generation - or even the second. If the best first-gen Xe manages to get anywhere close to an RTX2060 I'd consider that a significant success. But they will need to at least get within 10% of AMD (at parity pricing) in first/second gen and trade blows with whatever top-end nVidia offers in both the GPGPU and gaming spaces by third gen. And drivers need to not suck, too.

                    The comment about "more than a generational leap in CPU performance" has me chuckling quietly, though. I would hope the jump is that big, considering that IPC hasn't really increased all that much since Sandy Bridge. I was amused recently to see a friend still running a 2600K (albeit heavily overclocked) with an RTX 2070 and getting great performance in games. Heck, he gets better performance than me in Total War: Warhammer 2 - which does, admittedly, love high-clock CPUs. AMD really kicked Intel hard with Ryzen/Epyc, even more than they did with Athlon 64.
                    My thoughts exactly! On the other hand, there were some rumors lately (from Moore's Law is Dead and AdoredTV) that Intel could cancel Xe and that they were having some problems with Gen 13 in the labs. As there are always problems in early development I wouldn't read too much into it, but given the release time frame, it is up in the air whether they can be competitive with AMD and Nvidia at that point. I hope they deliver something interesting and new to the market, though, with great long-term driver support on Windows and Linux (e.g. not dropping support after three to four years, as happened with Sandy Bridge graphics on Windows).

                    It now comes down to solid execution and the patience of upper management not to cancel the effort prematurely.



                    • #20
                      Originally posted by Paradigm Shifter View Post

                      Actually, that's a really good point - AMD recently settled that ridiculous class-action suit about Bulldozer not being a "proper 8-core CPU" because a two-core module shared a single FPU.

                      So would this open up similar possibilities for Intel getting hit with "it's not a real XYZ-core CPU!" if all cores are not equal?
                      No, because literally tens of millions of devices with similar core layouts are sold every year in smartphones, and Intel is smart enough not to be deceptive in its labeling of chips. As such, unlike AMD, Intel would just be following standard industry practice. That's something AMD could learn from, since that suit was no more ridiculous than the truly idiotic suit brought against Nvidia over some minutiae in the ROPs of the GTX 970 (a card that completely obliterated anything from AMD in its time frame, BTW).

                      There were AMD fanboys who never bought the GTX 970 who acted like the GTX 970 "lie" was worse than all the war crimes committed in the last 100 years. Then they turned around and screamed that Bulldozer's "8 cores" was not in the least bit deceptive and that it was a really great design. Funny how, when Zen literally dumped everything from Bulldozer and actually works best when compilers pretend it's a Haswell, those same fanboys didn't dump on Zen -- I mean, Bulldozer was THAT GOOD, right? Then again, rank hypocrisy and technical stupidity are part of being a fanboy.

