Intel Sends Out Patches Bringing Up The "DG1" Graphics Card Under Linux

  • Intel Sends Out Patches Bringing Up The "DG1" Graphics Card Under Linux

    Phoronix: Intel Sends Out Patches Bringing Up The "DG1" Graphics Card Under Linux

    For months now Intel's open-source driver developers have been working on the "Gen12" graphics support needed most notably for Tiger Lake and more recently confirmed for Rocket Lake as well. But Gen12 is also the foundation for the highly anticipated Xe Graphics, with Intel's discrete graphics offerings to come in the months ahead. Building off the existing Gen12 graphics driver code, Intel today published the first DG1 patches for enabling their first discrete graphics card under Linux...
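
    A quick way to verify such enablement once the patches land is to check whether the kernel actually bound a driver to the card via sysfs. Below is a minimal, illustrative C sketch that walks /sys/bus/pci/devices looking for Intel display controllers and reports the bound driver; the 0x8086 vendor ID and the 0x03 PCI display class are standard, but nothing here is taken from the actual patch series.

    ```c
    /* List Intel display controllers and the kernel driver bound to them
     * by walking sysfs. On a system with working Gen12/DG1 enablement,
     * the expectation would be "i915" as the reported driver. */
    #include <dirent.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    static int read_line(const char *path, char *buf, size_t len)
    {
        FILE *f = fopen(path, "r");
        if (!f || !fgets(buf, (int)len, f)) {
            if (f)
                fclose(f);
            return -1;
        }
        buf[strcspn(buf, "\n")] = '\0'; /* strip trailing newline */
        fclose(f);
        return 0;
    }

    int main(void)
    {
        DIR *dir = opendir("/sys/bus/pci/devices");
        struct dirent *de;

        if (!dir) {
            perror("opendir");
            return 1;
        }
        while ((de = readdir(dir)) != NULL) {
            char path[512], vendor[16], class[16], link[512];
            ssize_t n;

            if (de->d_name[0] == '.')
                continue;
            snprintf(path, sizeof(path), "/sys/bus/pci/devices/%s/vendor", de->d_name);
            if (read_line(path, vendor, sizeof(vendor)) || strcmp(vendor, "0x8086"))
                continue; /* not an Intel device */
            snprintf(path, sizeof(path), "/sys/bus/pci/devices/%s/class", de->d_name);
            if (read_line(path, class, sizeof(class)) || strncmp(class, "0x03", 4))
                continue; /* not a display controller (class 0x03xxxx) */
            snprintf(path, sizeof(path), "/sys/bus/pci/devices/%s/driver", de->d_name);
            n = readlink(path, link, sizeof(link) - 1);
            link[n > 0 ? n : 0] = '\0';
            printf("%s: Intel display controller, driver: %s\n", de->d_name,
                   n > 0 ? strrchr(link, '/') + 1 : "(none)");
        }
        closedir(dir);
        return 0;
    }
    ```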


  • #2
    for the highly anticipated Xe Graphics
    I'm not sure there's really much "high anticipation" here beyond Intel's marketing hype train. Most comments are a groan followed by "again?". More like "we're cautiously waiting to see the performance reports of actual hardware on actual real world tasks. Anyone else remember i740 and the other canceled discrete GPU projects?"



    • #3
      The single big oops aside, Intel's GPU support has been quite solid for at least the past decade, just lacking in oomph. I'm waiting for someone to come up with an M.2 GPU that can use up to 25W in its usual mode, and an external enclosure that can boost it to 75W over Thunderbolt, to create a practical eGPU for mobile workstations.



      • #4
        Originally posted by stormcrow View Post

        I'm not sure there's really much "high anticipation" here beyond Intel's marketing hype train. Most comments are a groan followed by "again?". More like "we're cautiously waiting to see the performance reports of actual hardware on actual real world tasks. Anyone else remember i740 and the other canceled discrete GPU projects?"
        I'm kind of excited about Intel making a discrete GPU. They're a company with plenty of resources to dump into the software ecosystem, and they favor a more open approach, so we might see real competition to Nvidia's CUDA and cuDNN from open-source frameworks. Right now Nvidia has a stranglehold on machine learning, and anyone serious about training neural networks is forced to buy Nvidia.
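
        As a concrete illustration of that hoped-for openness (my example, not something the patches themselves cover): standards-based APIs such as OpenCL enumerate whatever GPUs are present instead of assuming an Nvidia device the way CUDA does. A minimal sketch, assuming an installed OpenCL ICD loader; build with -lOpenCL:

        ```c
        /* Enumerate OpenCL platforms and count their GPU devices --
         * a vendor-neutral counterpart to CUDA's Nvidia-only device query. */
        #include <CL/cl.h>
        #include <stdio.h>

        int main(void)
        {
            cl_platform_id platforms[8];
            cl_uint nplat = 0;

            if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
                fprintf(stderr, "no OpenCL platforms found\n");
                return 1;
            }
            if (nplat > 8)
                nplat = 8; /* we only fetched up to 8 platform handles */
            for (cl_uint i = 0; i < nplat; i++) {
                char name[256] = "unknown";
                cl_device_id devs[8];
                cl_uint ndev = 0;

                clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                                  sizeof(name), name, NULL);
                if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU,
                                   8, devs, &ndev) != CL_SUCCESS)
                    ndev = 0;
                printf("platform: %s, GPU devices: %u\n", name, ndev);
            }
            return 0;
        }
        ```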



        • #5
          Originally posted by stormcrow View Post
          I'm not sure there's really much "high anticipation" here beyond Intel's marketing hype train. Most comments are a groan followed by "again?". More like "we're cautiously waiting to see the performance reports of actual hardware on actual real world tasks. Anyone else remember i740 and the other canceled discrete GPU projects?"
          That + a bit of "remember Larrabee / Xeon Phi" vibe for those who had to deal with that.

          Huge hype, enormous delays, big developer pressure (in my scientific computing community) to adapt code to the hardware as soon as it was out even though it was massively subpar compared to then-current GPUs, and in the end it was basically thrown out of the window after the second generation...

          Or maybe we can talk about 3D XPoint, which was claimed to be almost as fast as DRAM ("1000x NAND") in its initial announcement, and certainly ended up quite a bit better than then-current NAND when it was released, but not worthy of anywhere near that level of bragging.

          Then Xeon + FPGA was the future, or maybe it was another kind of FPGA-ish reconfigurable architecture that mimicked compiler IR structure more closely, and now apparently back to GPUs we go.

          The number of super-ambitious R&D tracks that Intel seems to be trying to pull off at the same time, combined with their poor recent track record of producing good finished products (starting with their 10nm CPU process), is not particularly confidence-inspiring.

          If they get the thing to ship in a state that works as advertised for once, though, it's interesting for sure.
          Last edited by HadrienG; 21 May 2020, 02:52 AM.



          • #6
            3D XPoint was and still is MASSIVE. This is no Xeon Phi...



            • #7
              DG1 is mainly an SDV (software development vehicle) product. Given that Tiger Lake mobile will basically integrate the same GPU in a 15W package, there is absolutely no reason to release a discrete graphics card that probably no one will buy. DG1's sole purpose is to bring up the platform, create drivers, optimize them with game engines, and so on and so forth. The real product will be DG2, which has yet to be announced, but rumours are it will be a full line-up from top to bottom.



              • #8
                I too am feeling pretty hopeful about Intel's new GPU. I don't in the slightest expect a 2080 Ti killer, or even a 2070 killer. Intel's first-gen hardware seems rather mainstream in specs, but it's just the beginning. People take for granted how absurdly complicated GPUs are. If Intel manages to outperform AMD in performance-per-watt in this first generation, that will be very impressive.



                • #9
                  Originally posted by Snaipersky View Post
                  The single big oops aside, Intel's GPU support has been quite solid for at least the past decade, just lacking in oomph. I'm waiting for someone to come up with an M.2 GPU that can use up to 25W in its usual mode, and an external enclosure that can boost it to 75W over Thunderbolt, to create a practical eGPU for mobile workstations.
                  Single? Nope, this will be Intel's 3rd attempt to release a discrete GPU.



                  • #10
                    Originally posted by duby229 View Post

                    Single? Nope, this will be Intel's 3rd attempt to release a discrete GPU.
                    I was referring to the framebuffer exploit, the only major security flaw I'm aware of for Intel GPUs.

