Intel Confirms Their Discrete GPU Plans For 2020


  • DMJC
    replied
    Open source support is irrelevant. NVIDIA is available on Linux/Solaris/FreeBSD. Supercomputer makers already have the skills and tech to support NVIDIA deployments. The desktop market is owned by Microsoft, and PC gaming is on Windows. Ultimately the Linux desktop makes up at best 20-40 million machines. Don't kid yourself about how valuable Linux desktop support is to which GPU maker will win the market. I love Linux; I've been using it for 18 years. But ultimately it has never decided who wins GPU market share.



  • leipero
    replied
    PackRat
    That doesn't mean much; there are other reasons why that computer uses the NVIDIA Tesla V100. But to be fair, I was referring to "desktop"-market-oriented GPUs, not "pro" ones. In that case, the Vega 64 has 12,660 GFLOPS of SP compute power, while the 1080 Ti is close with 11,340 GFLOPS. I don't know how they calculate compute power for the Tesla V100, but NVIDIA claims 7 to 7.8 TFLOPS of DP compute, while they somehow got a number of 100+ TFLOPS for "deep learning". So clearly, at least on the desktop market, NVIDIA is still a bit behind on compute, while it is ahead on power consumption, and that's one of the reasons those GPUs are used.

    However, miners with GPUs almost universally use AMD GPUs, simply because the compute-per-price ratio and general compute power are higher than what NVIDIA offers. And Intel is also likely to enter that market first, in my opinion.



  • PackRat
    replied
    Originally posted by leipero View Post

    I would say RIP NVIDIA. Why? Because NVIDIA continues its strategy of "everything proprietary" and locking down its users. Intel is unlikely (in my opinion) to go down that path, for multiple reasons: one is that it would be a "newcomer" (sort of) in that segment; another is its "tradition" of contributing to free/open-source software.

    So it's likely that if Intel gets really good at GPUs, they will hurt NVIDIA much more, and we can potentially expect even better Mesa drivers. So far, it "smells" like the crypto thing is here to stay, and Intel saw potential in that market, so it is reasonable to assume that would be their primary target. Expect GPUs more similar to AMD's, with more compute power, i.e. "longer-lasting" GPUs that age well, as AMD's do. NVIDIA GPUs historically do not age well, simply because they lack compute power. (I even remember, back in the days of the 8000 series, the 8800 doing an extremely poor job in compute-intensive games while similar AMD GPUs did much better; the 8800 GT simply had terrible input lag despite high FPS, and even though its shader clock was high, around 500 GFLOPS vs 800+ GFLOPS was a big enough gap to make the difference.)
    The world's fastest supercomputer is running 27,000 NVIDIA GPUs...
    What compute? OpenCL is Apple's tech, dead on arrival. I have not drunk the whole cup of Wayland/FOSS kool-aid yet. The open-source solution is interesting, and I do think it's cool.

    It will be interesting to see how good Intel GPUs are.



  • Marc Driftmeyer
    replied
    Originally posted by microcode View Post

    Since it is fairly routine for vendors to reverse-engineer each other's new designs, I think it would be risky to do anything like that. Also, in terms of timing, Intel must already have been working on this stuff in earnest at least two to four years ago, so it's not likely to hinge primarily on AMD trade secrets.
    If Intel had already had the IP and hardware R&D ironed out 24-48 months ago, they wouldn't have cross-licensed with AMD.

    These won't compete with either AMD or Nvidia.

    Intel is about to get trounced in the Server and Desktop space.



  • GizmoChicken
    replied
    Originally posted by theriddick View Post
    I expect something like an RX 580 sort of deal; I would be VERY surprised if Intel can muster up a 1080 Ti beater. That would be nice, but it's kinda wishful thinking.
    While I hope that Intel will offer at least a few consumer dGPUs, my guess is that Intel’s primary target will be the lucrative accelerated virtual desktops market, which explains Intel’s investment in GVT-g.

    Originally posted by leipero View Post
    I would say RIP nvidia .
    And I would say that "the report of [Nvidia's] death has been grossly exaggerated."

    Actually, in all seriousness, I would say that if I'm right that Intel will target the accelerated virtual desktop market (which Nvidia currently dominates), Intel could take a bite out of Nvidia's revenues. But Nvidia has a huge lead, and the market is growing, so Nvidia probably isn't too worried yet.



  • leipero
    replied
    Originally posted by PackRat View Post
    RIP AMD
    I would say RIP NVIDIA. Why? Because NVIDIA continues its strategy of "everything proprietary" and locking down its users. Intel is unlikely (in my opinion) to go down that path, for multiple reasons: one is that it would be a "newcomer" (sort of) in that segment; another is its "tradition" of contributing to free/open-source software.

    So it's likely that if Intel gets really good at GPUs, they will hurt NVIDIA much more, and we can potentially expect even better Mesa drivers. So far, it "smells" like the crypto thing is here to stay, and Intel saw potential in that market, so it is reasonable to assume that would be their primary target. Expect GPUs more similar to AMD's, with more compute power, i.e. "longer-lasting" GPUs that age well, as AMD's do. NVIDIA GPUs historically do not age well, simply because they lack compute power. (I even remember, back in the days of the 8000 series, the 8800 doing an extremely poor job in compute-intensive games while similar AMD GPUs did much better; the 8800 GT simply had terrible input lag despite high FPS, and even though its shader clock was high, around 500 GFLOPS vs 800+ GFLOPS was a big enough gap to make the difference.)



  • theriddick
    replied
    I expect something like an RX 580 sort of deal; I would be VERY surprised if Intel can muster up a 1080 Ti beater. That would be nice, but it's kinda wishful thinking. AMD is expected to get back into the game in early 2019, but that might not mean more powerful cards, just more efficient ones.

    More efficient GPUs are good news for consoles, I suppose, but not so great for competition, given that the 1080 Ti will be quite long in the tooth by then but still not beaten (thankfully that's what I rock). The industry needs the 1080 Ti to shift down in price like the 980 Ti did, giving people at the mid tier access to 4K, 60 FPS gaming (which the 1080 Ti can mostly manage).



  • duby229
    replied
    Originally posted by Gusar View Post
    Well, they do make GPUs already, so it's not like they're starting completely from scratch. Just slapping a huge bunch of their current GPU execution units onto a dedicated card and adding a few gigs of GDDR would already produce something nice.

    Edit: Also, "Ravi"? Have you been watching too much iZombie? Raja Koduri is the one Intel poached from AMD.
    That would be asinine.



  • Michael
    replied
    Originally posted by bridgman View Post
    AFAIK none of them (Intel, AMD, NVidia) require binary firmware blobs in the kernel.

    The blobs are hardware microcode images which are uploaded into the GPUs to control on-chip hardware.

    The uploading is done by kernel driver code simply because nothing else is allowed to touch the hardware, but saying the code is "in the kernel" suggests that it is executing there, which is not the case.
    Right, apologies if the statement came across as unclear.



  • bridgman
    replied
    Originally posted by notanoob View Post
    If these discrete GPUs are going to be anything like the post-Haswell IGPs with the firmware blobs in the kernel, then avoid them and buy something else.

    Originally posted by Michael View Post
    You do realize current AMD and NVIDIA GPUs also require binary firmware blobs in the kernel for hardware acceleration?
    AFAIK none of them (Intel, AMD, NVidia) require binary firmware blobs in the kernel.

    The blobs are hardware microcode images which are uploaded into the GPUs to control on-chip hardware.

    The uploading is done by kernel driver code simply because nothing else is allowed to touch the hardware, but saying the code is "in the kernel" suggests that it is executing there, which is not the case.

