The Open-Source ATI R300 Graphics Driver Is Still Being Improved Upon In 2024


  • #31
    Originally posted by Eirikr1848 View Post

    I have a POWER5 server in storage. POWER6, maybe? idk. That being said, a laptop is more realistic.

    I have an iBook with a 9550, I believe, which is RV3xx architecture. It's ready to go, in a box. (Someone claimed to be affiliated with the ArchLinuxPOWER project, but that turned out not to be the case.)

    (The actual project page is: https://archlinuxpower.org/)

    I also have one with R200 hardware if you are a glutton for punishment and are in the mood for making an R200 Gallium driver. (Because, you know, it's oh-so-easy).
    Wow, that would be awesome! Please contact me at my email, pavel dot ondracka at gmail dot com, and we can discuss details. You can also check my profile page https://gitlab.freedesktop.org/ondracka and some older mesa r300 commits, like here https://gitlab.freedesktop.org/mesa/...e5d0bb70f28245 (before I got too lazy to add Signed-off-by), to verify that this is the correct email and that I'm indeed doing r300 development... :-)

    Regarding the R200, I'll leave this for another person, I'm not feeling adventurous enough ;-)



    • #32
      Originally posted by Paulie889 View Post

      Wow, that would be awesome! Please contact me at my email, pavel dot ondracka at gmail dot com, and we can discuss details. You can also check my profile page https://gitlab.freedesktop.org/ondracka and some older mesa r300 commits, like here https://gitlab.freedesktop.org/mesa/...e5d0bb70f28245 (before I got too lazy to add Signed-off-by), to verify that this is the correct email and that I'm indeed doing r300 development... :-)

      Regarding the R200, I'll leave this for another person, I'm not feeling adventurous enough ;-)
      Email sent!



      • #33
        Originally posted by YCbCr View Post

        So, out of curiosity I just tried this in a live ISO on an old HP laptop. Unfortunately, I misremembered about the GPU being a Mobility Radeon 2300 (RV515); it's actually a Mobility Radeon 3400 series (RV620) and therefore uses R600g (OpenGL 3.3, OpenGL ES 3.0, no Vulkan).

        Anyway, Firefox defaulted to Software WebRender (apparently as a workaround for a bug with some ancient Mesa / R600 version, according to about:support), but I was able to force-enable it (I think; about:support is rather confusing in this regard).

        Chromium 119 defaulted to the following:
        Works on my hardware too; I get the exact same results as you. The culprit was a wrong flag I passed to Chromium, whose behavior probably changed recently. Sorry for the noise, and thanks for checking on your HW.



        • #34
          Originally posted by gentoofu View Post
          Let's try not equating TDP to power consumption, please.
          It is fair not to equate them directly, but let's not pretend a 37 W TDP chip is going to consume 400 W like an RTX 4090 will.

          Yes, it is less efficient, but total consumption has an obvious ceiling here. None of these cards, to my memory, has an additional power connector, and the card pictured in the original article has no extra connector either, which means their power consumption must remain below the threshold of what the slot itself provides, which is IIRC 75 W.

          Most of these cards consume in the 60 W range. Being a little less efficient with a chip that only draws about 60 W is not exactly a deal breaker if the workload suits the chip.



          • #35
            Originally posted by ezst036 View Post
            It is fair not to equate them directly, but let's not pretend a 37 W TDP chip is going to consume 400 W like an RTX 4090 will.
            Not only that, but the idle power consumption of modern dGPUs is quite high - in many cases, higher than legacy GPUs. For desktop usage, this is very relevant.



            • #36
              Originally posted by coder View Post
              Not only that, but the idle power consumption of modern dGPUs is quite high - in many cases, higher than legacy GPUs. For desktop usage, this is very relevant.
              It's not. For example, this is lm-sensors data of an AMD RX 5500 card running the latest KDE Plasma:

              amdgpu-pci-2d00
              Adapter: PCI adapter
              vddgfx: 718.00 mV
              fan1: 0 RPM (min = 0 RPM, max = 3200 RPM)
              edge: +49.0°C (crit = +110.0°C, hyst = -273.1°C)
              (emerg = +115.0°C)
              junction: +49.0°C (crit = +105.0°C, hyst = -273.1°C)
              (emerg = +110.0°C)
              mem: +0.0°C (crit = +105.0°C, hyst = -273.1°C)
              (emerg = +110.0°C)
              PPT: 6.00 W (cap = 135.00 W)
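If anyone wants to script this kind of check, the PPT figure can also be pulled out of the sensors output programmatically. A minimal sketch in Python (the dump string and the read_ppt_watts helper are my own illustration, not part of lm-sensors):

```python
import re

# lm-sensors output as pasted above; on a real system you would capture it
# with subprocess.run(["sensors"], capture_output=True, text=True).stdout
sensors_dump = """\
amdgpu-pci-2d00
Adapter: PCI adapter
vddgfx: 718.00 mV
PPT: 6.00 W (cap = 135.00 W)
"""

def read_ppt_watts(dump: str) -> float:
    """Extract the package power (PPT) in watts from amdgpu sensors output."""
    match = re.search(r"^PPT:\s*([\d.]+)\s*W", dump, re.MULTILINE)
    if match is None:
        raise ValueError("no PPT line found in sensors output")
    return float(match.group(1))

print(read_ppt_watts(sensors_dump))  # 6.0
```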

              If you're just running basic desktop apps, the card might even use less power than old crap cards. The new 7 nm node is much more power-efficient than the old 150 nm Radeon 9700.

              I'm surprised that I get even lower power-usage readings with the RX 5500 than with the older RX 550. The newer card only consumes more when playing recent games. But of course those games won't run at playable speeds on older cards anyway, so it's not a fair comparison.
              Last edited by caligula; 09 January 2024, 04:01 AM.



              • #37
                Originally posted by NSLW View Post
                Not long ago I thought it's better to have an AMD/ATI GPU than an NVIDIA one long-term, due to their open-source strategy allowing near-endless HW driver support, but if essential SW doesn't support the HW, then there is little difference.
                Indeed, sometimes the hardware really cannot support modern usage. And this is especially true for older GPUs, with their weird architecture and limited programmability of the time.

                Granted, I don't see what functionality of Chrome would cause it to drop to software rendering entirely on an R300 (it runs on old phones, ffs), but I don't know much about its rendering stack.



                • #38
                  Originally posted by caligula View Post
                  It's not. For example, this is lm-sensors data of an AMD RX 5500 card running the latest KDE Plasma:
                  The RX 5500 is the smallest RDNA GPU. That's not what I was talking about.

                  Apparently, people with the RX 7900 XT are getting as low as 17 W, but it goes up with more monitors and higher refresh rates, easily reaching 40-50 W.

                  The Intel A770 also idles in the ballpark of 45 W.

                  Nvidia's RTX 3080 tends to idle at more like 34 W.

                  Originally posted by caligula View Post
                  PPT: 6.00 W (cap = 135.00 W)
                  Was yours running at PCIe 3.0 or 4.0?

                  Originally posted by caligula View Post
                  If you're just running basic desktop apps, the card might even use less power than old crap cards. The new 7 nm node is much more power-efficient than the old 150 nm Radeon 9700.
                  Modern graphics memory is much more power-hungry than what those old cards used. PCIe 4.0 also burns more power than legacy interfaces. Depending on how aggressively the GPU wakes up when it gets work to do, the compute portions can burn more power than you'd expect, even on light workloads.



                  • #39
                    Originally posted by Paulie889 View Post
                    What is really broken is PowerPC support. :-( We have quite a lot of users with old Power Macs, and nothing really works there. Unfortunately I don't have any hardware to debug on, and I was not successful in fixing it just from the provided logs, so if someone has some working Power HW with an R300-R500 GPU (or with a PCIe slot I could plug one into) and would be willing to send it, that would be much appreciated.
                    First of all, let me tell you that you are a kind person.

                    I cannot give you my iBook G4, sorry; it's too important to me (even if I don't use it), and also the card is R200 hardware (ATI Mobility Radeon 9200).
                    Still, I suppose that since many distros have ended support for 32-bit PowerPC (Debian, for example), a fix would be less useful today.
                    But the simple fact that you want to do this in 2024 gives me hope for humanity. Reading your post has changed my day, thanks.



                    • #40
                      Originally posted by coder View Post
                      The RX 5500 is the smallest RDNA GPU. That's not what I was talking about.
                      So what? It makes absolutely no sense to compare the most powerful modern cards with some museum hardware. Even the RX 5500 is really powerful compared to high-end cards released a few years earlier. According to TechPowerUp, this card (RX 5500 XT is the exact model) is faster than the AMD Radeon R9 290X, a 295 W TDP card.

                      Was yours running at PCIe 3.0 or 4.0?
                      I think 4.0. It's a B550M motherboard.
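For what it's worth, there's no need to guess: the kernel exposes the negotiated link speed in sysfs (something like /sys/class/drm/card0/device/current_link_speed, though the exact path and the speed strings can vary by kernel version), and mapping the reported GT/s rate to a PCIe generation is trivial. A small sketch (the helper name is mine):

```python
# Per-lane signalling rates for PCIe generations 1-5.
SPEED_TO_GEN = {2.5: 1, 5.0: 2, 8.0: 3, 16.0: 4, 32.0: 5}

def pcie_generation(link_speed: str) -> int:
    """Parse a sysfs-style string like '16.0 GT/s PCIe' into a generation number."""
    rate = float(link_speed.split()[0])
    return SPEED_TO_GEN[rate]

print(pcie_generation("16.0 GT/s PCIe"))  # 4
```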

                      Modern graphics memory is much more power-hungry than what those old cards used. PCIe 4.0 also burns more power than legacy interfaces. Depending on how aggressively the GPU wakes up when it gets work to do, the compute portions can burn more power than you'd expect, even on light workloads.
                      I've had the same case and same level of ventilation for years. Haven't seen any increase in temperature.

