The Experimental GCN 1.0 GPU Support Might Be Dropped From AMDGPU Linux Driver


  • The Experimental GCN 1.0 GPU Support Might Be Dropped From AMDGPU Linux Driver

    Phoronix: The Experimental GCN 1.0 GPU Support Might Be Dropped From AMDGPU Linux Driver

    By default the Linux kernel selects the aging Radeon DRM driver for GCN 1.0 "Southern Islands" and GCN 1.1 "Sea Islands" hardware (as well as all older ATI/AMD GPUs), while GCN 1.2 and newer default to the modern AMDGPU kernel driver. For years there has been experimental GCN 1.0/1.1 support available in AMDGPU via kernel module options, but now that code is at risk of being dropped for the original GCN GPUs...

    http://www.phoronix.com/scan.php?pag...t-Drop-GCN-1.0
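    For context, the experimental path the article refers to is opted into with module parameters that flip which driver claims the GPU. A minimal sketch of the commonly documented kernel command-line form, assuming a GCN 1.0 or 1.1 card (verify the parameter names against your own kernel with `modinfo amdgpu` and `modinfo radeon`):

    ```shell
    # Kernel command line (e.g. GRUB_CMDLINE_LINUX_DEFAULT) for a
    # GCN 1.0 "Southern Islands" card:
    #   radeon.si_support=0  - tell the legacy radeon driver to ignore SI GPUs
    #   amdgpu.si_support=1  - tell amdgpu to bind to SI GPUs instead
    radeon.si_support=0 amdgpu.si_support=1

    # The GCN 1.1 "Sea Islands" equivalent uses the cik_* parameters:
    radeon.cik_support=0 amdgpu.cik_support=1
    ```

    Both halves matter: setting only `amdgpu.si_support=1` is not enough if the radeon driver still claims the card first.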

  • #2
    Shame



    • #3
      Considering there are so many 7000, 8000, 200, 300 and 400 series cards that use the GCN 1.0 architecture, it's seriously asinine...

      https://en.wikipedia.org/wiki/List_o...HD_7000_series
      Seriously, scroll down from here and count them...

      If it's a choice between accelerated video decode and Vulkan, I would most definitely choose Vulkan. It's what makes gaming on Linux truly viable.

      EDIT: If that architecture didn't matter, then why keep rereleasing it, generation after generation? No doubt GCN1 cards are still being sold brand new right now as we speak. Asinine...

      https://www.merriam-webster.com/dictionary/asinine
      Yes AMD, this situation fits both definitions.
      Last edited by duby229; 12-31-2019, 02:18 AM.



      • #4
        I'd rather it stay experimental for eternity than be dropped entirely. As said, Vulkan beats UVD.



        • #5
          Ugh, this is terrible support by AMD. They could just get off their asses, release the firmware for SI, then allow community contributors to fix up the driver to work with GCN 1.0 hardware. But no, they won't do that.

          As I've said several times before, AMD's Linux support is still shit compared to Intel's, and it will bite them in the ass.



          • #6
            Originally posted by sandy8925 View Post
            Ugh, this is terrible support by AMD. They could just get off their asses, release the firmware for SI, then allow community contributors to fix up the driver to work with GCN 1.0 hardware. But no, they won't do that.

            As I've said several times before, AMD's Linux support is still shit compared to Intel's, and it will bite them in the ass.
            I'm not so sure that's true. Look at how long it took Intel to adopt Gallium3D, on the excuse that their classic drivers were so well optimized. Now that there actually is a Gallium driver, we know for sure that was bullshit.



            • #7
              This makes me question my tentative plans to buy AMD when my GeForce GTX750 dies.



              • #8
                Originally posted by ssokolow View Post
                This makes me question my tentative plans to buy AMD when my GeForce GTX750 dies.
                The trick is buying hardware that you already know is well supported. If you were to buy any Vega right now, you would be extremely happy with its feature set (maybe not so much with its power efficiency), because it is extremely well supported right now. The same will be true whenever you decide to buy: pick something that is already well supported at that time. And if you want high performance and open source, it will definitely be something from AMD.



                • #9
                  Originally posted by ssokolow View Post
                  This makes me question my tentative plans to buy AMD when my GeForce GTX750 dies.
                  IMHO just go for Intel's Xe dGPU when they launch a powerful enough version. It'll work out of the box from day 1, and all features will be supported and actually working (OpenCL, video decoding and encoding, etc.).



                  • #10
                    Originally posted by duby229 View Post

                    I'm not so sure that's true. Look at how long it took Intel to adopt Gallium3D, on the excuse that their classic drivers were so well optimized. Now that there actually is a Gallium driver, we know for sure that was bullshit.
                    Sure, but their drivers were still stable, performant enough, and feature complete (yes, newer OpenGL versions and whatnot took some time, but basic and even advanced OpenGL worked from day 1).

