Intel Arc Graphics Driver Change Leads To A Big Speed-Up Under Linux


    Phoronix: Intel Arc Graphics Driver Change Leads To A Big Speed-Up Under Linux

    With the latest Mesa 23.2 code as of Friday there is now a rather significant performance optimization for Intel's graphics driver stack that really helps out Intel Arc Graphics DG2/Alchemist along with upcoming Meteor Lake graphics. Counter-Strike: Global Offensive, for example, was found to be 11% faster now with this single driver change and other Vulkan apps/games benefiting as well...


  • #2
    I really want Arc to do well, given how awful the GPU market is, but I sense Intel has mostly given up and their GPUs will be AI-only in a couple of years.

    It doesn't help that game development is basically broken, with Nvidia and AMD effectively shipping game patches in their drivers.



    • #3
      When will the next Arc GPU Windows 11 vs. Linux Benchmarks take place?



      • #4
        Originally posted by Britoid View Post
        I really want Arc to do well, given how awful the GPU market is, but I sense Intel has mostly given up and their GPUs will be AI-only in a couple of years.

        It doesn't help that game development is basically broken, with Nvidia and AMD effectively shipping game patches in their drivers.
        There's a case to be made for consumer-level AI cards too, though: training one's own AI helper bot(s) as opposed to literally giving away one's own neural brain-graph to the GAFAMs. A 32GB or 64GB GPU with lots of low-precision streaming processors would certainly be on my list, and is entirely possible within a consumer budget given N5 and N3 transistor densities.

        If competition works right (yes I get it, big IF), then we might even actually see this.
        Last edited by vegabook; 24 June 2023, 07:22 AM.



        • #5
          Originally posted by vegabook View Post

          There's a case to be made for consumer-level AI cards too, though: training one's own AI helper bot(s) as opposed to literally giving away one's own neural brain-graph to the GAFAMs. A 32GB or 64GB GPU with lots of low-precision streaming processors would certainly be on my list, and is entirely possible within a consumer budget given N5 and N3 transistor densities.

          If competition works right (yes I get it, big IF), then we might even actually see this.
          Are you another AI/ML statistics-on-steroids bubble victim?



          • #6
            Originally posted by timofonic View Post

            Are you another AI/ML statistics-on-steroids bubble victim?
            No, I've been doing (traditional) data science for a long time, and while I'm wary of the hype, sure, I do think these new models are interesting and innovative, especially the transformer ("attention") architecture.

            My personal experience, for myself and others, was initial hostility to Copilot, for example. Now, four months in, I find it amazingly useful for boilerplate / mundane code blocks / quick-and-dirty "how to" hints, and I've seen others who were initially hostile later begrudgingly, and sometimes even enthusiastically, buying in. Thing is, 95% of the time, if I had had a model looking at my own personal coding history, most of this stuff would already be in my own personal model. Hence the idea of a personal bot.

            I do get it though that people are immediately derogatory and defensive and there are good reasons for that too.
            Last edited by vegabook; 24 June 2023, 08:24 AM.



            • #7
              Has the idle power usage issue been resolved yet, i.e. does it no longer consume 40W at idle?



              • #8
                Michael is teasing us



                • #9
                  DG2 needs VM_BIND on i915 or HuC on Xe. I get that both are a big commitment, but don't buy a DG2 until one or the other gets done.

                  Originally posted by Britoid View Post
                  I really want Arc to do well, given how awful the GPU market is, but I sense Intel has mostly given up and their GPUs will be AI-only in a couple of years.

                  It doesn't help that game development is basically broken, with Nvidia and AMD effectively shipping game patches in their drivers.
                  Did you lose an arm that you can still feel, too? No idea where you got that sense; Intel is supporting their DG2 GPUs perfectly fine, and there have been no indications of Intel giving up outside of a couple of less than reputable leakers. Despite what I said above, DG2 is progressing quite nicely on my A380. I don't have any qualms about the work itself being done, just that Intel is putting VM_BIND support for i915 on the back burner and doesn't seem to be planning to add HuC to Xe anytime soon. These are both major features, and without both in one driver you get an incomplete package. That being said, if you only need one or the other, things are pretty good.



                  • #10
                    Originally posted by emansom View Post
                    Has the idle power usage issue been resolved yet, i.e. does it no longer consume 40W at idle?
                    Your motherboard needs the appropriate ASPM settings exposed in BIOS/UEFI. You also can't have a fancy display or several not-so-fancy displays. I'm on triple 1440p@165 so I was SOL.

                    # of Displays    Can Arc A-Series enter the low power state when idle?
                    1                Yes, up to 4k@60
                    2                Yes, up to 1080p@60
                    3                No
                    4                No
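                    For what it's worth, a quick way to check whether the kernel side is even allowing ASPM is the pcie_aspm policy file in sysfs, where the active policy is shown in brackets. A minimal sketch (the sysfs path is the standard kernel interface; the helper function and the sample string are just for illustration):

```shell
# The kernel reports ASPM policies as a space-separated list, with the
# active one in brackets, e.g.:
#   default performance [powersave] powersupersave
active_aspm_policy() {
    # $1: contents of /sys/module/pcie_aspm/parameters/policy
    echo "$1" | grep -o '\[[^]]*\]' | tr -d '[]'
}

# On a live system you would feed it the sysfs file:
#   active_aspm_policy "$(cat /sys/module/pcie_aspm/parameters/policy)"
# Illustration with a sample string:
active_aspm_policy "default performance [powersave] powersupersave"
```

                    If the active policy reads performance, or the file is missing entirely, that may point to the BIOS/UEFI ASPM settings (or kernel config) rather than the GPU driver.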

