RadeonSI Primitive Culling Lands In Mesa 19.2

  • RadeonSI Primitive Culling Lands In Mesa 19.2

    Phoronix: RadeonSI Primitive Culling Lands In Mesa 19.2

    The past few months AMD's Marek Olšák has been working on primitive culling support for the RadeonSI Gallium3D driver and last week that code was merged into the Mesa 19.2 development code...


  • #2
    Most games or their engines already do culling on the software side, so in a real-world scenario there shouldn't be major performance improvements.
    But nice to have it!

    Btw, what is different with culling in async compute vs vertex shaders?

    Comment


    • #3
      And break Vega10 and RX chips.

      Comment


      • #4
        Code:
        + /* TODO: enable this after the GDS kernel memory management is fixed */
        + bool enable_on_pro_graphics_by_default = false;
        +
        + if (sctx->screen->debug_flags & DBG(ALWAYS_PD) ||
        +     sctx->screen->debug_flags & DBG(PD) ||
        +     (enable_on_pro_graphics_by_default &&
        +      sctx->screen->info.is_pro_graphics &&
        +      (sctx->family == CHIP_BONAIRE ||
        +       sctx->family == CHIP_HAWAII ||
        +       sctx->family == CHIP_TONGA ||
        +       sctx->family == CHIP_FIJI ||
        +       sctx->family == CHIP_POLARIS10 ||
        +       sctx->family == CHIP_POLARIS11 ||
        +       sctx->family == CHIP_VEGA10 ||
        +       sctx->family == CHIP_VEGA20)))
        If I read this right, it is disabled by default?

        Comment


        • #5
          Originally posted by gsedej View Post
          Most games or their engines already do culling on the software side, so in a real-world scenario there shouldn't be major performance improvements.
          In software you generally cannot afford to do this kind of fine-grained culling, especially with very large single geometries and rapidly changing transforms, which are common in ParaView.
          Last edited by microcode; 20 May 2019, 11:45 AM.

          Comment


          • #6
            Originally posted by ihatemichael
            I wonder if gaming is the only thing that matters these days, I see them doing nothing to fix the radeonsi+glamor bugs.

            Very disappointing.
            I don't think there would be much of a GPU market if video games weren't this big. It's pretty much the thing that allows the general public to have reasonably powerful machines, and it's useful for work. If that wasn't the case, people who want something more powerful than a Facebook machine would need to pay extra for "workstations".

            Comment


            • #7
              Originally posted by ihatemichael

              None of that justifies the fact that RadeonSI is a buggy mess with GLAMOR.
              What do you mean by buggy with GLAMOR specifically? I mean, I use Arch Linux with an RX 470/HD 7770 (my 7950 died on me recently) with the latest everything and the AMDGPU DDX, and so far I haven't noticed anything wrong with the rendering (GNOME 3.32.2/Deepin).

              Do you mean this when using the modesetting DDX? And is this recent? (I use the GNOME Wayland session almost exclusively these days.)

              Comment


              • #8
                Originally posted by ihatemichael
                I wonder if gaming is the only thing that matters these days, I see them doing nothing to fix the radeonsi+glamor bugs.

                Very disappointing.
                Kind of an ironic statement, given that this doesn't help gaming and isn't even enabled on desktop cards. It seems to help the workstation market only.

                Comment


                • #9
                  Originally posted by ihatemichael

                  You can talk all day about these things but the fact is that RadeonSI+GLAMOR continues being buggy. Intel (i965+modesetting+glamor) gives me zero issues.
                  I've used GLAMOR for years on a variety of generations without encountering any issues...
                  Do you have them when using the proper DDX instead of modesetting?

                  Comment


                  • #10
                    Originally posted by ihatemichael

                    I get frequent artifacts (corruption, invisible objects, etc) with the modesetting driver.
                    Mmm, is there a reason you don't want to use the amdgpu DDX? Just saying, that one seems to work really well, and I'm not entirely sure modesetting is meant to work properly with AMDGPU: there are a bunch of features it doesn't natively support either way, like VRR and vblank timestamp queries (I think). If I remember right, it was meant as a last-resort option for DDX-less drivers, not an all-in-one DDX replacement for all drivers.

                    Update: posted before seeing your last comment, just ignore this one.

                    Comment
