Intel Announces Arc B-Series "Battlemage" Discrete Graphics With Linux Support

  • pWe00Iri3e7Z9lHOX2Qx
    Senior Member
    • Jul 2020
    • 1608

    #41
    Originally posted by Quackdoc View Post

    Thankfully not an issue for us linux folk
    Performance in my short time with the A770 was a total shit show on Linux on old systems without ReBAR / SAM (even with Above 4G Decoding enabled). Windows performance was actually much better at the time. My impression was basically...
    • Windows: you really should have ReBAR enabled
    • Linux: you absolutely need ReBAR enabled

    I was hoping there would be memory controller upgrades to make these usable for gaming in old systems.

    Comment

    • Quackdoc
      Senior Member
      • Oct 2020
      • 5123

      #42
      Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post

      Performance in my short time with the A770 was a total shit show on Linux on old systems without ReBAR / SAM (even with Above 4G Decoding enabled). Windows performance was actually much better at the time. My impression was basically...
      • Windows: you really should have ReBAR enabled
      • Linux: you absolutely need ReBAR enabled

      I was hoping there would be memory controller upgrades to make these usable for gaming in old systems.
      Really? I'm very much without ReBAR and see zero difference with Above 4G Decoding enabled, zero difference on any system I've tested it on. I'm currently using it on a 2600, which doesn't support ReBAR in the first place (well, not without UEFI modding).
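
      For anyone wanting to check ReBAR status on their own Linux box: the BAR sizes are exposed in sysfs under /sys/bus/pci/devices/<address>/resource, where each line holds a region's start address, end address, and flags in hex. A minimal sketch for decoding one of those lines (the addresses below are made-up illustrations, not from any specific card):

```python
def bar_size_mib(resource_line: str) -> int:
    """Compute a PCI BAR's size in MiB from one line of
    /sys/bus/pci/devices/<addr>/resource (start, end, flags in hex)."""
    start, end, _flags = (int(tok, 16) for tok in resource_line.split())
    if end == 0:          # unused BAR
        return 0
    return (end - start + 1) // (1 << 20)

# Example: a classic 256 MiB non-ReBAR aperture vs a resized 16 GiB BAR.
small = "0x0000004000000000 0x000000400fffffff 0x000000000014220c"
large = "0x0000004000000000 0x00000043ffffffff 0x000000000014220c"
print(bar_size_mib(small))  # 256
print(bar_size_mib(large))  # 16384
```

      A pre-ReBAR aperture is typically 256 MiB; with ReBAR active, one BAR grows to cover the whole VRAM (e.g. 16 GiB on an A770 16GB).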

      Comment

      • MillionToOne
        Phoronix Member
        • Aug 2024
        • 108

        #43
        Originally posted by Quackdoc View Post

        How so? My A380 now performs far better than my RX 580 did in most games I play, and while granted Polaris is old now, it's still relatively competitive for its price range.
        My A770 16GB performs like shit in nearly all games using D3D12 under the hood, and in native Vulkan too. RDR2, for example, at max settings without MSAA runs about 30% worse than under Windows with Vulkan. Not only that, but games like GR Breakpoint refuse to launch, even though the game has DX11 and native Vulkan. Ray-traced games don't work well; until recently there were shader bugs in Hitman 3, unless they fixed themselves, since they didn't always happen. Titles running under DXVK generally work; however, even then the performance is a fraction of what it would be on Windows with DXVK.

        Comment

        • Quackdoc
          Senior Member
          • Oct 2020
          • 5123

          #44
          Originally posted by MillionToOne View Post

          My A770 16GB performs like shit in nearly all games using D3D12 under the hood, and in native Vulkan too. RDR2, for example, at max settings without MSAA runs about 30% worse than under Windows with Vulkan. Not only that, but games like GR Breakpoint refuse to launch, even though the game has DX11 and native Vulkan. Ray-traced games don't work well; until recently there were shader bugs in Hitman 3, unless they fixed themselves, since they didn't always happen. Titles running under DXVK generally work; however, even then the performance is a fraction of what it would be on Windows with DXVK.
          I don't use Windows, so I can't comment on that specifically, but DXVK on Linux hasn't been a particular concern for me. It may just be that since I use an A380, which is a far lower tier, I'm not hitting any obvious bottlenecks, but the Linux experience for me has been solid. Outside of Ghost of Tsushima still not working, every game more or less performs as I would expect.

          Comment

          • fintux
            Phoronix Member
            • Nov 2019
            • 54

            #45
            Originally posted by pioto View Post
            What's the purpose of sticking PCIe 5.0 support on the host side (since Alder Lake), yet not making a GPU that would use it?
            What's the point of sticking PCIe 5.0 on the GPU when it's not capable of using the bandwidth? PCIe 5.0 doesn't automatically increase performance; it just adds bandwidth, and if bandwidth wasn't the bottleneck, it only adds to the cost of the GPU.

            Comment

            • coder
              Senior Member
              • Nov 2014
              • 8982

              #46
              Can anyone confirm whether it will have native fp64 support? Alchemist didn't, which I think they concluded was a mistake. Even though games don't use much fp64, it still gets enough use that having to emulate it can measurably affect performance.
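
              For context on why those fp64 operations can't simply be downcast to fp32 instead of emulated: the precision gap shows up quickly in accumulating computations. A quick Python illustration (Python floats are IEEE-754 doubles; fp32 is approximated here by rounding through the struct module):

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python float (an IEEE-754 double) to single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Accumulate a small step a million times, as a simulation tick might.
acc64 = 0.0
acc32 = 0.0
for _ in range(1_000_000):
    acc64 += 0.1
    acc32 = to_f32(acc32 + to_f32(0.1))

print(acc64)  # stays within a tiny fraction of 100000
print(acc32)  # drifts visibly away from 100000
```

              The double-precision sum lands essentially on 100000, while the single-precision one is off by a clearly visible margin, which is why the driver has to emulate fp64 rather than silently demote it.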

              Comment

              • coder
                Senior Member
                • Nov 2014
                • 8982

                #47
                Originally posted by fintux View Post
                What's the point of sticking PCIe 5.0 on the GPU when it's not capable of using the bandwidth? PCIe 5.0 doesn't automatically increase performance, it just adds bandwidth, and if that wasn't the bottleneck, it's only going to add to the cost of the GPU.
                It reduces the latency of data transfers to GPU memory, which can improve performance in some games. For the typical game, the impact on average fps is quite low. You see a bigger impact on 1% minimums, especially when using a fast CPU & GPU combination. If you look at the relative impact graphs, the effect is almost the same at all 3 resolutions they tested:

                We take a second look at PCI-Express performance scaling of NVIDIA's GeForce RTX 4090 flagship card. This time with a Core i9-13900K, which hopefully helps us highlight more differences than what we've seen with Ryzen 5800X last year. We've also added minimum FPS and ray tracing testing.


                Here's the graph for 1440p (chart image not reproduced here):
                Another case we've seen where PCIe can be a real bottleneck is when a graphics card with too little memory is run at high settings. This increases the turnover rate of data in graphics memory, exacerbating whatever bottleneck the PCIe link might be having. The most well-known example of this was the RX 6500 XT, with its meager x4-lane connection, when run on a motherboard supporting only PCIe 3.0. When used in such a configuration, cranking up the settings caused a disproportionate drop in framerates, relative to cards either with more memory or more PCIe bandwidth.
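
                To put rough numbers on that RX 6500 XT example: theoretical per-direction PCIe bandwidth is the transfer rate times lane count times encoding efficiency (128b/130b from Gen3 on). A back-of-the-envelope sketch, ignoring protocol overhead beyond line encoding:

```python
# Per-lane transfer rates in GT/s; Gen3 and later use 128b/130b encoding.
GT_PER_S = {3: 8.0, 4: 16.0, 5: 32.0}

def pcie_gb_s(gen: int, lanes: int) -> float:
    """Approximate per-direction PCIe bandwidth in GB/s."""
    return GT_PER_S[gen] * (128 / 130) / 8 * lanes  # bits -> bytes

for gen, lanes in [(3, 4), (4, 4), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: {pcie_gb_s(gen, lanes):.1f} GB/s")
```

                An x4 card dropped into a PCIe 3.0 board gets roughly 3.9 GB/s each way, an eighth of a Gen4 x16 slot, which is why overflowing a small VRAM pool hurts so disproportionately in that configuration.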

                Comment

                • smitty3268
                  Senior Member
                  • Oct 2008
                  • 6970

                  #48
                  Originally posted by coder View Post
                  Can anyone confirm whether it will have native fp64 support? Alchemist didn't, which I think they concluded was a mistake.
                  Where did you see that?

                  I didn't see anything in the slides about fp64 support, and they were calling out some other new features granting increased performance, so I am pretty doubtful it's there.

                  Comment

                  • creoflux
                    Phoronix Member
                    • Dec 2019
                    • 72

                    #49
                    I look forward to seeing some compute workload benchmarks; I want faster LLM responses. The game benchmarks are fun, maybe as a data point for comparison, but actually gaming would need to come with a bucket of spare time.

                    Comment

                    • Anux
                      Senior Member
                      • Nov 2021
                      • 1961

                      #50
                      Originally posted by coder View Post
                      Can anyone confirm whether it will have native fp64 support? Alchemist didn't, which I think they concluded was a mistake. Even though games don't use much fp64, it still gets enough use that having to emulate it can measurably affect performance.
                      I've seen it on the Intel slides, there you go.

                      Comment
