
How to tell if a driver is gallium or just mesa? (Slow rendering with radeon)


  • How to tell if a driver is gallium or just mesa? (Slow rendering with radeon)

    Dear Phoronix members!

    I have changed my Linux distro from Ubuntu 16.04 to the latest arch32 and installed the xf86-video-ati and mesa packages. Direct rendering is enabled, but 3D performance is slower than before. The window manager is the same as before, but I have no display manager now and just do a startx to start dwm, so generally the system is sometimes even faster, except for 3D.

    glxgears reports 50-60 FPS. Extreme Tux Racer lags a bit (especially in the menu), and there is an interesting thing: glxinfo does not report "Gallium 0.4 on ...." anymore. Earlier it said "Gallium 0.4 on llvmpipe" or "Gallium 0.4 on ATI ..." whatever, both for software and hardware rendering.
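    (A side note on the glxgears number: Mesa's DRI drivers usually sync glxgears to the display refresh, so 50-60 FPS may just mean vsync is on rather than that rendering is slow. Mesa's vblank_mode environment variable can disable the sync for a less misleading reading; a sketch:)

    ```shell
    # vblank_mode=0 tells Mesa's DRI drivers to skip vsync, so glxgears
    # reports raw rendering throughput instead of the refresh rate.
    vblank_mode=0 glxgears
    ```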

    Am I missing something on my system, or have the strings just changed to not say this? Could someone who knows at least tell me whether this is only a change in the strings with everything else the same, or whether it really indicates there is no Gallium here for some reason?

    See these outputs:

    $ glxinfo | grep Open
    OpenGL vendor string: X.Org R300 Project
    OpenGL renderer string: ATI RC410
    OpenGL version string: 2.1 Mesa 19.0.3
    OpenGL shading language version string: 1.20
    OpenGL extensions:
    OpenGL ES profile version string: OpenGL ES 2.0 Mesa 19.0.3
    OpenGL ES profile shading language version string: OpenGL ES GLSL ES 1.0.16
    OpenGL ES profile extensions:

    $ export LIBGL_ALWAYS_SOFTWARE=1; glxinfo | grep Open
    OpenGL vendor string: VMware, Inc.
    OpenGL renderer string: llvmpipe (LLVM 8.0, 128 bits)
    OpenGL core profile version string: 3.3 (Core Profile) Mesa 19.0.3
    OpenGL core profile shading language version string: 3.30
    OpenGL core profile context flags: (none)
    OpenGL core profile profile mask: core profile
    OpenGL core profile extensions:
    OpenGL version string: 3.1 Mesa 19.0.3
    OpenGL shading language version string: 1.40
    OpenGL context flags: (none)
    OpenGL extensions:
    OpenGL ES profile version string: OpenGL ES 3.0 Mesa 19.0.3
    OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
    OpenGL ES profile extensions:
    My GPU is this one (lshw indicates an r300 kind of card):

                    description: VGA compatible controller
                    product: RC410M [Mobility Radeon Xpress 200M]
                    vendor: Advanced Micro Devices, Inc. [AMD/ATI]
                    physical id: 5
                    bus info: pci@0000:01:05.0
                    version: 00
                    width: 32 bits
                    clock: 66MHz
                    capabilities: vga_controller bus_master cap_list rom
                    configuration: driver=radeon latency=64 mingnt=8
                    resources: irq:17 memory:c0000000-cfffffff ioport:9800(size=256) memory:fe1f0000-fe1fffff memory:c0000-dffff
    I am aware that there used to be an r300 classic and an r300 gallium driver, but that was long ago. Is it possible that this is the classic one? How can I know which it is?
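    One way to see which driver libGL actually loads (a sketch, assuming Mesa's GLX loader; the library path varies by distro) is the LIBGL_DEBUG variable:

    ```shell
    # LIBGL_DEBUG=verbose makes Mesa's libGL print loader messages on
    # stderr, including which hardware driver it dlopens; a Gallium r300
    # setup shows something like .../dri/r300_dri.so being loaded.
    LIBGL_DEBUG=verbose glxinfo 2>&1 | grep -i 'dri' | head
    ```

    For what it is worth, as far as I know recent Mesa releases only ship the Gallium r300 driver; the classic one was dropped years ago.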

    Maybe I am completely missing something instead, but I can see the performance was better earlier...

  • #2
    Use Debian testing/Sid; it is as easy to use as Ubuntu and familiar to you.

    You can upgrade Buster to Sid by adding the Sid repository.


    • #3
      But I would prefer to know what the issue is and why there is a visible slowdown while 3D is still accelerated. I don't think it is a distro issue, or can it be?

      After all my xorg.conf hacks (same as on the earlier machine), a "high" power profile, and removing the Spectre and Meltdown mitigations via kernel parameters, I get 300-350 FPS in glxgears, but there is still a clearly visible performance drop. UrT is unplayable now while it ran without any issue before. Nexus: The Jupiter Incident was also playable before. I also see some minor degradation with 2D games using OpenGL, but other than that the system is actually generally faster, with less memory and CPU usage...

      When I changed the distro I also changed to newer kernel and Xorg versions. Can it be that they just grew 50% to 100% slower on an r300 GPU, single-core machine?

      Also, I think the driver may well be Gallium, as I can see
      export GALLIUM_HUD="cpu,fps;primitives-generated"
      working properly.
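      That check is a reasonable one: GALLIUM_HUD is only honored by Gallium drivers, so if any overlay shows up at all, the driver is Gallium. A sketch for listing which counters the driver actually exposes (which may also explain a counter that does nothing):

      ```shell
      # GALLIUM_HUD=help makes a Gallium driver print its usage text and
      # the counters it supports; a name the driver doesn't list (for
      # example "primitives-generated" on some chips) will do nothing.
      GALLIUM_HUD=help glxgears
      ```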

      Btw I think I have gotten the hang of the new distro and things are up and running. Everything is nice except I cannot tell why 3D performance became so degraded on this card after moving to a newer driver, Xorg and kernel... I thought these parts were mostly unchanged.


      • #4
        PS: Except the "primitives-generated" part is not working.
        PS2: I mentioned UrT because it is not a Wine game but runs natively, so it is a better measure. Among Wine games, Mount & Blade: Warband ran without any stuttering on this very machine on the earlier setup, albeit on low graphics of course. It is still a fairly recent game that came out after this machine did. Now it is also unplayable. Generally, everything 3D performs badly, but the driver seems to be up and running... I am just interested in the cause.


        • #5
          Disable DRI3, remove unused systemd services, use a non-debug 1000 Hz timer kernel.
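          For the DRI3 part, the usual way with the radeon DDX is a Device option in xorg.conf; a minimal sketch (the Identifier is just a placeholder):

          ```
          Section "Device"
              Identifier "Radeon"
              Driver     "radeon"
              Option     "DRI" "2"    # stay on DRI2 instead of DRI3
          EndSection
          ```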


          • #6
            Originally posted by debianxfce:
            Disable DRI3, remove unused systemd services, use a non-debug 1000 Hz timer kernel.
            DRI3 was already disabled on this card, as it runs DRI2 by default, but I actually added the line to xorg.conf just to be sure. There are barely any unused systemd services: 50-70 MB memory usage and practically zero CPU when the system is idle running X + dwm. I checked the current kernel timer in:

            less /proc/config.gz
            ^^ and as far as I can tell I am running a 300 Hz kernel, see:

            # CONFIG_HZ_100 is not set
            # CONFIG_HZ_250 is not set
            # CONFIG_HZ_1000 is not set
            I see why it could be a good idea to change to a 1000 Hz one, but I have no idea how much it might help. Thanks for the suggestion though! This is something I was not aware of at least ;-)
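            A one-liner sketch for that check, assuming the kernel exposes its config (CONFIG_IKCONFIG_PROC=y, which is what makes /proc/config.gz exist at all):

            ```shell
            # Pull the timer frequency and preemption model straight from the
            # running kernel's embedded config.
            zgrep -E 'CONFIG_HZ|CONFIG_PREEMPT' /proc/config.gz
            ```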

            In the meantime I am trying to put together a secondary system with a very old 2.6 kernel, because then I might even have the fglrx drivers too. That will not be a main system; it will just sit in my laptop's internal SD card reader... If this succeeds I might be able to kexec into it, but maybe I cannot even make it work with that old stuff, and I would prefer to have the open source driver speed sorted out, as it served me so well for years. ;-)


            • #7
              Originally posted by prenex:
              I see why it could be a good idea to change to a 1000 Hz one, but I have no idea how much it might help. Thanks for the suggestion though! This is something I was not aware of at least ;-)
              You need to use every possible optimization, because you have only one core. Optimizing the kernel has a huge effect; for example, a non-debug kernel boots many seconds faster than a debug kernel. The 1000 Hz timer makes the system react to events faster. I have disabled swap, the CPU frequency governor, slow security features, everything that slows things down and is not needed. Read the kernel config documentation when configuring with make gconfig or make xconfig. You need to install some GTK and Qt development packages to make these editors work.
              Last edited by debianxfce; 05-19-2019, 03:53 AM.


              • #8
                I have tried the zen-kernel because I could get it via pacman easily and it is supposed to be a desktop kernel, but looking at /proc/config.gz it still has 300 HZ. No noticeable difference so far with zen-kernel, maybe around 0.5 to 1 FPS more in Extreme Tux Racer, so it runs at 11-17 FPS instead of 10-15, haha. Both are much worse than before, when I still had my Ubuntu 16.04...

                Now I will try to compile my own zen-kernel and somehow set the value to 1000 HZ. I have no idea though what you mean by "debug kernel" and what I need to turn off to not have those debug things.

                Btw I am actually thinking that maybe it is not even the timer HZ but another parameter that counts: CONFIG_PREEMPT_VOLUNTARY vs CONFIG_PREEMPT.

                If I understand these correctly, the latter allows the scheduler to preempt tasks that are right in the middle of a syscall, while the former only allows it "voluntarily" at predefined points. I cannot check it myself now, as I do not have my earlier system anymore, but according to internet forums the former is the default in Ubuntu generic kernels, and I see the latter in my Arch kernel.
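                A quick way to sketch-check the preemption model of the running kernel without digging through the config: fully preemptible kernels advertise it in the version banner.

                ```shell
                # CONFIG_PREEMPT kernels include "PREEMPT" in the build banner
                # that uname -v prints; voluntary-preemption kernels do not,
                # so no match means voluntary (or no) preemption.
                uname -v | grep -o 'PREEMPT[A-Z_]*' || echo 'no full preemption'
                ```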

                I will try the 1000 HZ setting alongside setting the voluntary flag if I can compile my kernel... but I am not really sure this is a kernel problem. Could it be that this driver is just not the same as the older radeon driver from before Gallium, or is that impossible? I still have not really given up on the idea that there is some bigger issue here...


                • #9
                  This is the last point I can track down on the internet that still names two r300 drivers:


                  I really have no idea though how I could have r300 classic and not r300g, or if that is possible at all. Maybe in the end I will even need to compile Mesa myself just to understand it better or whatever :-).

                  Am I missing something or only these count?
                  - The kernel (version) and its config
                  - The radeon driver (version) and if it is r300 or r300g and if it has changed lately or not
                  - The Xorg (version)
                  - Things that run on my system alongside me trying to run a 3D app

                  I have no display manager or any further layers; I just issue startx and enter dwm from there, so there shouldn't be much overhead. I guess display managers and desktop environments only take resources but do not optimize things...


                  • #10
                    Oh, and of course config files count too, like xorg.conf...