
Intel Confirms Their Discrete GPU Plans For 2020


  • #31
    Originally posted by leipero View Post
    I don't know how they calculate compute power for the Tesla V100, but NVIDIA claims 7 to 7.8 TeraFLOPS of DP compute power, while they somehow got a number of 100+ TeraFLOPS for "deep learning".
    Deep learning needs less precision in its calculations; I think they can get away with 16-bit floating point operations instead of 32-bit (the 7 to 7.8 TeraFLOPS figure is FP64, while the 100+ TeraFLOPS number is FP16 throughput on the tensor cores). This of course changes the FLOPS (floating-point operations per second), as a 16-bit operation is much cheaper than a 32-bit one.
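    Rough numbers as a sanity check (a back-of-the-envelope sketch in Python; the SM count, unit counts and boost clock are NVIDIA's published V100 figures quoted from memory, so treat them as approximate):

        # Back-of-the-envelope peak-throughput estimate for the Tesla V100.
        sms = 80                  # streaming multiprocessors
        boost_clock_hz = 1.53e9   # ~1530 MHz boost clock

        # FP64: 32 double-precision units per SM, 2 FLOPs per FMA per cycle.
        fp64_tflops = sms * 32 * 2 * boost_clock_hz / 1e12

        # "Deep learning" figure: 8 tensor cores per SM, each doing a 4x4x4 FP16
        # matrix FMA per cycle = 64 FMAs = 128 FLOPs per cycle.
        fp16_tensor_tflops = sms * 8 * 128 * boost_clock_hz / 1e12

        print(f"FP64:        ~{fp64_tflops:.1f} TFLOPS")          # ~7.8
        print(f"FP16 tensor: ~{fp16_tensor_tflops:.1f} TFLOPS")   # ~125

    Same chip, wildly different headline numbers, purely depending on which precision and which units you count.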



    • #32
      Originally posted by DMJC View Post
      Open source support is irrelevant. NVIDIA is available on Linux/Solaris/FreeBSD
      I've been to the year 3000. Not much has changed, but they lived underwater. Oh, and NVIDIA stopped producing drivers for Solaris and FreeBSD somewhere around 2025.
      Open source is the only thing that is relevant when it comes to buying certain types of hardware, IMO.

      (Crappy drivers certainly made fixing my time machine to get home a massive pain in the butt. I spent most of it patching around all the DRM that existed in the future. Oh, and firmware blobs... they didn't release the source code to them even 500 years after the hardware (and company) went obsolete. Those fsck(8)ers!)
      Last edited by kpedersen; 13 June 2018, 05:30 AM.



      • #33
        But will those cards be able to game?
        Will it run Crysis?



        • #34
          I'm sure Intel is going to target not only the gaming market but also (primarily?) the GPGPU market. But they can't use CUDA, so will we see a big advancement in open OpenCL driver development?
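          If you're curious which vendors actually ship a working OpenCL driver on a given box, a quick sketch with pyopencl (not from the post above, just an illustration; it assumes pyopencl is installed and simply enumerates whatever vendor ICDs are registered) looks something like:

              # List every OpenCL platform (i.e. vendor driver/ICD) and its devices.
              import pyopencl as cl

              for platform in cl.get_platforms():
                  print(f"{platform.name} ({platform.vendor}) - {platform.version}")
                  for device in platform.get_devices():
                      print(f"  {device.name}: {device.max_compute_units} compute units")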



          • #35
            Originally posted by DMJC View Post
            Open source support is irrelevant. NVIDIA is available on Linux/Solaris/FreeBSD. Super computer makers already have the skills/tech to support NVIDIA deployments. Desktop market is owned by Microsoft and PC gaming is on Windows. Ultimately Linux desktop makes up at best 20-40 million machines. Don't kid yourself on how valuable Linux desktop support is to which GPU maker will win the market. I love Linux, I've been using it for 18 years. But ultimately it doesn't and has never decided who wins in GPU marketshare.
            The Linux desktop is a valuable niche for GPU makers to support. The VFX industry mainly runs on Linux desktops; that could be one of the main driving factors behind us having a Linux NVIDIA driver at all.



            • #36
              Originally posted by kpedersen View Post

              I've been to the year 3000. Not much has changed, but they lived underwater. Oh, and NVIDIA stopped producing drivers for Solaris and FreeBSD somewhere around 2025.
              Open source is the only thing that is relevant when it comes to buying certain types of hardware, IMO.

              (Crappy drivers certainly made fixing my time machine to get home a massive pain in the butt. I spent most of it patching around all the DRM that existed in the future. Oh, and firmware blobs... they didn't release the source code to them even 500 years after the hardware (and company) went obsolete. Those fsck(8)ers!)
              Been there (and then) too, can confirm.



              • #37
                Originally posted by starshipeleven View Post
                Nah, I could really use some sub-100$ GPUs to drive multiple screens so I could say "fuck it" to NVIDIA's murderously overpriced multihead cards for workstations.
                Did you even read the man's post? If you knew Intel's architecture, you would agree; you clearly don't.



                • #38
                  Originally posted by duby229 View Post
                  Did you even read the man's post? If you knew Intel's architecture, you would agree; you clearly don't.
                  Did you even read the man's post? If you knew anything about GPUs, you would agree; you clearly don't.

                  The main reason Intel GPUs are weak is that they don't have that many "execution units", which is done because they need to leave most of the chip's TDP to the CPU cores. Seriously, the iGPU uses less than 10 W now; if they can keep adding units up to a 70 W or even 160 W budget, it would change a lot.
                  Another reason is that they share system memory, which is kinda meh for a GPU (dedicated GPUs use GDDR for a reason).

                  If you place "a huge bunch" of them on a die and add "a few gigs" of GDDR, then yeah, it would "produce something nice". It won't be top-tier, but they can go at least somewhere into the midrange.
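                  The rough math behind that (a hedged sketch: the 16 FP32 FLOPs per clock per EU figure comes from Gen9's two 4-wide FMA-capable SIMD FPUs, and the EU count and clock for the bigger part are made-up examples, not an announced product):

                      # Peak FP32 estimate: 2 SIMD-4 FPUs x 4 lanes x 2 (FMA) = 16 FLOPs/clock/EU.
                      def peak_gflops(eus, clock_ghz, flops_per_eu_per_clock=16):
                          return eus * flops_per_eu_per_clock * clock_ghz

                      print(peak_gflops(24, 1.15))   # today's GT2 iGPU: ~442 GFLOPS
                      print(peak_gflops(512, 1.2))   # hypothetical big discrete part: ~9830 GFLOPS

                  Of course the memory side has to scale too; feeding a part like that from shared dual-channel DDR4 would starve it, which is exactly why the "few gigs" of GDDR matter.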



                  • #39
                    Frankly, what pixel-shuffling silicon needs these days is lower power usage, and Intel is great here. For example, they can squeeze a GPU that can drive two 4K displays, plus 4 CPU cores, into a 6 W TDP product. I'm looking forward to the time when these kinds of products will also be able to run some real graphics workloads, not just a bunch of terminals.



                    • #40
                      Originally posted by leipero View Post
                      PackRat, I was referring to "desktop"-market-oriented GPUs, not "pro" ones.
                      AMD Radeons do have some pro features, like 10-bit color (which also needs a monitor that supports it) and GPU pass-through, that with NVIDIA would require a Quadro. NVIDIA's evil proprietary driver will block GPU pass-through with a GeForce card: it refuses to initialize when it detects it is running inside a VM, unless the VM hides the hypervisor from the guest.

                      I will assume here that Intel's GPU pass-through will not be blocked the way NVIDIA's is.

