Intel Removes ASTC Hardware From Gen12.5+ Graphics


  • Intel Removes ASTC Hardware From Gen12.5+ Graphics

    Phoronix: Intel Removes ASTC Hardware From Gen12.5+ Graphics

    Somewhat of a surprising change with Intel Gen12.5 graphics is that they have removed the hardware supporting Adaptive Scalable Texture Compression (ASTC). Intel's Linux graphics driver has now been updated to address Gen12.5+ foregoing hardware support for ASTC texture compression...


  • #2
    I know only Switch emulators use this.



  • #3
    posed compilations
    I presume this is meant to be "posed complications".
    Last edited by Imroy; 07 October 2021, 07:03 AM. Reason: spelling :P



  • #4
    On ANV, this means we need to disable support for all the ASTC formats as well as disable the textureCompressionASTC_LDR feature bit.
    So what happens when a Vulkan renderer needs to decompress ASTC textures on affected Intel GPUs?
    Is there a seamless fallback available by default, or do developers need to incorporate a special workaround specific to ANV?
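
    For context, the feature bit quoted above is part of core Vulkan, so a renderer can detect the situation itself. A minimal sketch of that check (the helper name is made up, not anything from ANV):

      #include <stdbool.h>
      #include <vulkan/vulkan.h>

      /* Returns true when the device can sample ASTC LDR formats natively. */
      static bool device_supports_astc_ldr(VkPhysicalDevice phys)
      {
          VkPhysicalDeviceFeatures features;
          vkGetPhysicalDeviceFeatures(phys, &features);
          /* VK_FALSE here (e.g. ANV on Gen12.5+ after this change) means the
           * app must ship BC7/uncompressed copies or decode ASTC itself. */
          return features.textureCompressionASTC_LDR == VK_TRUE;
      }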



  • #5
    Admit it, Intel, you just forgot to implement it, didn't you?



  • #6
    It's weird... ASTC is quite interesting. What happened? Is it too complex to implement, and are the classic DXT etc. formats OTOH still considered more than "good enough"? As far as I can tell, neither AMD nor Nvidia ever implemented ASTC in hardware.



  • #7
    Originally posted by Linuxxx View Post
    So what happens when a Vulkan renderer needs to decompress ASTC textures on affected Intel GPUs?
    Is there a seamless fallback available by default, or do developers need to incorporate a special workaround specific to ANV?

    On Linux, at least, you will fall back to CPU decompression automatically.
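
    For anyone who would rather not depend on an automatic driver fallback, here is a hedged sketch of an application-side check (the function name and the plain-RGBA fallback are assumptions for illustration, not ANV's mechanism):

      #include <vulkan/vulkan.h>

      static VkFormat pick_texture_format(VkPhysicalDevice phys)
      {
          VkFormatProperties props;
          vkGetPhysicalDeviceFormatProperties(phys,
                                              VK_FORMAT_ASTC_4x4_UNORM_BLOCK,
                                              &props);
          if (props.optimalTilingFeatures & VK_FORMAT_FEATURE_SAMPLED_IMAGE_BIT)
              return VK_FORMAT_ASTC_4x4_UNORM_BLOCK;  /* native ASTC sampling */
          /* No hardware ASTC: decode the ASTC payload on the CPU
           * (e.g. with ARM's astc-encoder tooling) and upload plain RGBA. */
          return VK_FORMAT_R8G8B8A8_UNORM;
      }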

    Originally posted by willmore View Post
    Admit it, Intel, you just forgot to implement it, didn't you?

    We can't be too sure of that. Something worse has happened.

    Originally posted by brent View Post
    It's weird... ASTC is quite interesting. What happened? Is it too complex to implement, and are the classic DXT etc. formats OTOH still considered more than "good enough"? As far as I can tell, neither AMD nor Nvidia ever implemented ASTC in hardware.

    The quality difference between ASTC and the other options is quite small. I am not sure whether upscaling, which is becoming a common thing, will make these differences more or less important.

    What happened here is horrible.

    Yes, DXT's original problem was that it was patented.
    (Link: ARM-software/astc-encoder - the official repository for the Arm ASTC Encoder, a texture compressor for the Adaptive Scalable Texture Compression data format.)

    ASTC's problem is pending patents from ARM, as far as I know. This has never been cleared up.

    Yes, the fact that AMD never implemented ASTC in hardware is very problematic. AMD was one of the two parties who paid for the development of ASTC; ARM was the other, and ARM basically stabbed AMD in the back with a few patent applications. ARM sold licenses for its graphics IP to Intel, Imagination/PowerVR, and anyone making an ARM-based CPU.

    There was a 2012 presentation on ASTC, and at that presentation there was an AMD development board with ASTC in hardware; that was before AMD found out about the patent back-stab. So it's not true that AMD has never implemented ASTC in hardware. What is true is that AMD has never shipped ASTC hardware to customers, and we will most likely be waiting until 2023, if AMD ever does.

    Basically, we cannot have good all-around hardware texture compression and decompression because parties keep patenting things. At least these days we can get decent texture compression using methods whose patents have expired.




  • #8
    Before misinformation spreads:
    • ASTC was crazy expensive to manufacture. All the block modes meant a lot of dedicated die area was needed to support it. A bummer, because supporting only a fraction of these modes (i.e. the most popular ones) could have made the format more commercially viable. This is the main reason AMD and NVIDIA didn't bother supporting it on desktop.
    • BC7 has similar quality and storage needs as ASTC at its highest quality, while BC1 & BC4/5 can replace ASTC at the intermediate qualities (see the bit-rate arithmetic after this post). BCn compression modes are much simpler and thus cheaper to manufacture, but are patented (OK, the BC1-5 patents recently expired). But if you target DirectX, you already must support BC1 through BC7, so it's better to support BCn than ASTC.
    • If you're manufacturing for Android, iOS (or the Switch), you won't be supporting DirectX at all, so you don't have Microsoft to foot the bill, and nobody wants to pay the BCn patent fees, so they implement ASTC instead. That is why ASTC is so popular on mobile. However, now that the BC1-5 patents have expired, it's a good question what will happen going forward.

    TL;DR: ASTC is good but costly (in terms of $ and die space; and space problems translate to heat and power issues, which ironically is what compression tries to solve). If Khronos wants ASTC to have a rock-solid future, perhaps it should start by making a survey of popular formats and only supporting those in HW.
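
    To make the storage comparison above concrete, a small worked sketch: every ASTC block is 128 bits regardless of its footprint, so bits-per-pixel is just 128 / (width x height), while BC1 is a fixed 4 bpp and BC7 a fixed 8 bpp.

      #include <stdio.h>

      int main(void)
      {
          /* ASTC: 128-bit blocks over variable footprints. */
          const struct { int w, h; } astc[] = {
              {4, 4}, {6, 6}, {8, 8}, {10, 10}, {12, 12}
          };
          for (unsigned i = 0; i < sizeof astc / sizeof astc[0]; i++)
              printf("ASTC %2dx%-2d : %.2f bpp\n", astc[i].w, astc[i].h,
                     128.0 / (astc[i].w * astc[i].h));
          /* BCn: fixed 4x4 blocks; BC1 = 64 bits, BC7 = 128 bits per block. */
          printf("BC1        : %.2f bpp\n", 64.0 / 16);
          printf("BC7        : %.2f bpp\n", 128.0 / 16);
          return 0;
      }

    Note that ASTC 4x4 lands at the same 8 bpp as BC7, which is why the two compete directly at the high-quality end.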



  • #9
    Originally posted by brent View Post
    It's weird... ASTC is quite interesting. What happened? Is it too complex to implement, and are the classic DXT etc. formats OTOH still considered more than "good enough"? As far as I can tell, neither AMD nor Nvidia ever implemented ASTC in hardware.

    Nvidia did implement ASTC in at least the Tegra X1, an older SoC also used by the Nintendo Switch.
    Besides the murky patent situation, the industry collectively screwed itself over.
    ASTC's flexibility came at an incredibly high cost to implement in hardware, which was overlooked for too long.

    So ASTC is unfortunately not a practical successor to current texture compression schemes.
    If better quality is required, we need a new standard with a much better quality-per-transistor ratio.



  • #10
    Ah, I see, it's more or less what I expected. I looked at the ASTC spec a few years ago and thought: holy, this stuff is complex!

    ASTC also seems to have a lot of useless stuff. I mean, what's the point of these countless bit rates? Quality is absolutely useless for the modes below 2 bpp, and in practice no one is going to tune compression to the point of making fine-grained decisions between different bit rates anyway. Texture compression is already slow enough as-is...
