Khronos ASTC: Royalty-Free Next-Gen Texture Compression


  • #16
    If it's sw decompression in the drivers, that renders texture compression a net negative: longer loading times + worse quality than uncompressed, and without the bw reduction of hw-supported schemes.



    • #17
      ETC2 is a pretty ingenious algorithm. Somewhat better than S3TC, yet of roughly similar complexity. It uses a clever method to switch between different compression modes for each block, and is backwards compatible with ETC1!
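
      The backwards-compatibility trick, as I understand it, is that ETC2 hides its extra modes in bit patterns that are invalid in ETC1's differential mode: if adding the signed delta to a base colour channel would overflow, no ETC1 encoder ever produces that block, so ETC2 reuses it to signal a new mode. A rough sketch of the mode selection (my reading of the spec, not reference code):

      #include <stdint.h>
      #include <stdio.h>

      /* Sketch of ETC2 block-mode selection: the T/H/planar modes live in bit
       * patterns that are illegal in ETC1 differential mode (base + delta
       * overflowing a 5-bit channel). */
      enum etc2_mode { ETC1_INDIVIDUAL, ETC1_DIFFERENTIAL, ETC2_T, ETC2_H, ETC2_PLANAR };

      static int overflows(int base5, int delta3)
      {
          if (delta3 >= 4)              /* delta is 3-bit two's complement: -4..+3 */
              delta3 -= 8;
          int sum = base5 + delta3;
          return sum < 0 || sum > 31;
      }

      static enum etc2_mode classify_block(uint64_t block)   /* 64-bit block, bit 63 = MSB */
      {
          if (!((block >> 33) & 1))                          /* 'diff' bit clear: ETC1 individual mode */
              return ETC1_INDIVIDUAL;

          int r = (block >> 59) & 0x1F, dr = (block >> 56) & 0x7;
          int g = (block >> 51) & 0x1F, dg = (block >> 48) & 0x7;
          int b = (block >> 43) & 0x1F, db = (block >> 40) & 0x7;

          if (overflows(r, dr)) return ETC2_T;               /* red overflow signals T mode */
          if (overflows(g, dg)) return ETC2_H;               /* green overflow signals H mode */
          if (overflows(b, db)) return ETC2_PLANAR;          /* blue overflow signals planar mode */
          return ETC1_DIFFERENTIAL;                          /* otherwise a plain ETC1 block */
      }

      int main(void)
      {
          uint64_t block = 0x0000000200000000ULL;            /* diff bit set, zero colours/deltas */
          printf("mode = %d\n", classify_block(block));      /* prints 1 (ETC1_DIFFERENTIAL) */
          return 0;
      }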



      • #18
        Originally posted by uid313 View Post
        I am not so sure about that. S2TC was designed especially to be compatible with S3TC.
        Right, so I meant that hopefully ASTC will be compatible as well, so that this compression scheme can replace S3TC like S2TC can. Even if it's not compatible, it should be no trouble to recompress the original textures with ASTC and then decode on the fly, with the S3TC functions replaced by ASTC functions within the game code, for example.
        Last edited by DeepDayze; 08-06-2012, 10:57 PM.



        • #19
          Originally posted by MaxToTheMax View Post
          So basically, this is great news ten years from now, but not terribly useful today.
          That's the same with all the graphics API updates.

          You can't even use OpenGL 3.2 today thanks to the geniuses at Intel's Windows driver group only supporting GL 3.1 on their D3D 10.1 hardware.

          Any new graphics tech coming out now can only really be used by a programmer planning to release actual products 5-10 years from now.



          • #20
            Originally posted by DeepDayze View Post
            Right, so I meant that hopefully ASTC will be compatible as well, so that this compression scheme can replace S3TC like S2TC can. Even if it's not compatible, it should be no trouble to recompress the original textures with ASTC and then decode on the fly, with the S3TC functions replaced by ASTC functions within the game code, for example.
            Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
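
            If someone did go that route it would basically be a decode-then-re-encode pass at load time, roughly like this sketch (astc_decode_block_4x4 and dxt1_encode_block are hypothetical stand-ins, not calls from any real library):

            #include <stdint.h>

            /* Hypothetical stand-ins for a real ASTC decoder and DXT1 encoder;
             * an actual implementation would plug real codecs in here. */
            static void astc_decode_block_4x4(const uint8_t *in128, uint8_t *rgba_out)
            {
                (void)in128;
                for (int i = 0; i < 4 * 4 * 4; i++)
                    rgba_out[i] = 128;                        /* stub: flat grey */
            }

            static void dxt1_encode_block(const uint8_t *rgba16px, uint8_t *out64)
            {
                (void)rgba16px;
                for (int i = 0; i < 8; i++)
                    out64[i] = 0;                             /* stub */
            }

            /* Sketch: transcode ASTC 4x4 data to DXT1 at load time.  Hardware
             * decompression keeps working, but you give up ASTC's quality edge
             * and are back to feeding the GPU S3TC blocks. */
            void transcode_astc4x4_to_dxt1(const uint8_t *astc, uint8_t *dxt1,
                                           int width, int height)
            {
                uint8_t rgba[4 * 4 * 4];                      /* one decoded 4x4 block, RGBA8 */
                int blocks_x = (width + 3) / 4;
                int blocks_y = (height + 3) / 4;

                for (int i = 0; i < blocks_x * blocks_y; i++) {
                    astc_decode_block_4x4(astc + 16 * i, rgba);   /* 128-bit ASTC block in */
                    dxt1_encode_block(rgba, dxt1 + 8 * i);        /* 64-bit DXT1 block out */
                }
            }
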
            Last edited by MaxToTheMax; 08-07-2012, 12:33 AM.



            • #21
              Originally posted by MaxToTheMax View Post
              Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
              Yeah, my understanding is that the S3TC patents are so generic that transcoding in that manner would still infringe them, which is why a simple solution like that hasn't already been added to Mesa.

              However, ASTC is supposed to cover the same types of uses that S3TC is good for, so hopefully it will fully replace it soon enough. The S3TC patents should expire around 2018-19 or so, which is probably around when this will start seeing widespread use anyway.



              • #22
                So now that S3TC has an official replacement, maybe developers will at least be allowed to use S2TC by default.

                and wtf is "below 1bit"??? always on or always off???



                • #23
                  Originally posted by jakubo View Post
                  and wtf is "below 1bit"??? always on or always off???
                  These compression algorithms work on large blocks of pixels, so it can (for example) encode a 12x12 block in 128 bits; that's 144 pixels in 128 bits, or ~0.89 bits/pixel. I doubt the quality will be acceptable for the majority of content at that bitrate, though.

                  ASTC is interesting in that it can change the size of the pixel block that each 128-bit chunk describes, giving different effective bitrates and allowing a fair bit of fine-tuning based on the content being compressed.
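
                  To put numbers on that: every ASTC block is 128 bits regardless of footprint, so the effective bitrate is just 128 divided by the pixel count. A quick check (the 2D footprint list is from my reading of the spec, so treat it as illustrative):

                  #include <stdio.h>

                  /* Every ASTC block is 128 bits; only the footprint (pixels covered)
                   * changes.  2D footprint list as I understand the spec. */
                  int main(void)
                  {
                      const int sizes[][2] = {
                          {4, 4}, {5, 4}, {5, 5}, {6, 5}, {6, 6}, {8, 5}, {8, 6},
                          {8, 8}, {10, 5}, {10, 6}, {10, 8}, {10, 10}, {12, 10}, {12, 12},
                      };
                      const int n = sizeof(sizes) / sizeof(sizes[0]);

                      for (int i = 0; i < n; i++) {
                          int w = sizes[i][0], h = sizes[i][1];
                          printf("%2dx%-2d block: %5.2f bits/pixel\n", w, h, 128.0 / (w * h));
                      }
                      return 0;   /* 4x4 comes out at 8.00 bpp, 12x12 at ~0.89 bpp */
                  }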



                  • #24
                    ARM has released an eval ASTC encoder/decoder! Joy!

                    http://www.malideveloper.com/develop...compressor.php



                    • #25
                      I have a question about how games approach this;
                      Do they compress the texture beforehand, or compress it live (when you load the game)?
                      The latter might sound silly, but the main point is to save video memory, right? If so, then you would be able to choose the compression method.
                      (Although for a mobile platform, hard disk space might be a bit of an issue as well)


                      Also, I'm tempted to write an export/import plugin for ASTC in GIMP.
                      Last edited by Micket; 08-07-2012, 10:04 AM.



                      • #26
                        ASTC for GIMP would be awesome!

                        Both approaches are often used. I believe on desktop systems it's most common to have the graphics driver encode it for you on the fly, since it makes the data files of the game easier to work with.
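
                        In GL terms, letting the driver encode on the fly is (if I remember right) just a matter of uploading plain pixels while asking for a compressed internal format, something like this sketch (assumes the S3TC extension is present):

                        #include <GL/gl.h>
                        #include <GL/glext.h>               /* GL_COMPRESSED_RGBA_S3TC_DXT5_EXT */

                        /* Sketch: hand the driver uncompressed RGBA pixels but request a
                         * compressed internal format, so the driver does the encoding when
                         * it stores the texture. */
                        void upload_and_let_driver_compress(const void *rgba_pixels, int w, int h)
                        {
                            glTexImage2D(GL_TEXTURE_2D, 0,
                                         GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,  /* compressed internal format */
                                         w, h, 0,
                                         GL_RGBA, GL_UNSIGNED_BYTE,         /* source data is plain RGBA8 */
                                         rgba_pixels);
                        }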



                        • #27
                          Originally posted by MaxToTheMax View Post
                          Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
                          Ahh, so most likely games (and drivers) will have to add support for ASTC natively, such as detecting which compression formats are available and prompting the user to select one as part of the initial game configuration.
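
                          That detection would presumably boil down to a GL extension check rather than separate libraries; a rough sketch (extension names as I recall them):

                          #include <string.h>
                          #include <GL/gl.h>

                          /* Sketch: pick the best compressed-texture format the driver
                           * advertises.  Return value is just an arbitrary preference rank. */
                          int pick_texture_format(void)
                          {
                              const char *ext = (const char *)glGetString(GL_EXTENSIONS);
                              if (!ext)
                                  return 0;                                              /* no context? fall back */
                              if (strstr(ext, "GL_KHR_texture_compression_astc_ldr"))
                                  return 2;                                              /* prefer ASTC if exposed */
                              if (strstr(ext, "GL_EXT_texture_compression_s3tc"))
                                  return 1;                                              /* otherwise S3TC */
                              return 0;                                                  /* otherwise uncompressed */
                          }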



                          • #28
                            Yeah, something like that.



                            • #29
                              So. The license for the encoder/decoder released by ARM is non-free, so I suppose one has to start by writing a free implementation.
                              How tedious.



                              • #30
                                Originally posted by Micket View Post
                                I have a question about how games approach this;
                                Do they compress the texture beforehand, or compress it live (when you load the game)?
                                The /vast/ majority of (mobile, at least) games compress their textures beforehand, so the artists can decide which compression methods/modes are used and make sure they all look acceptable for what they are used for.

                                There is some interesting work on applying extra compression to an already compressed texture, such as Crunch (http://code.google.com/p/crunch/). (The compressed formats a graphics card reads need each block to be decompressible independently, so they can't exploit repetition or patterns between blocks.) I also believe Civ5 uses CUDA to transcode something like JPEG to DXT on the graphics card, since the bottleneck is getting the data from CPU memory to graphics card memory. On mobile this is a bit different, as the CPU and GPU tend to share memory and bandwidth.
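
                                For the compressed-beforehand path the game just hands the ready-made blocks to the driver; a sketch for an ASTC 4x4 texture (assuming the KHR ASTC extension, enum value taken from that spec):

                                #include <GL/gl.h>

                                /* Enum from the KHR_texture_compression_astc_ldr extension; defined
                                 * here in case the local glext.h predates it. */
                                #ifndef GL_COMPRESSED_RGBA_ASTC_4x4_KHR
                                #define GL_COMPRESSED_RGBA_ASTC_4x4_KHR 0x93B0
                                #endif

                                /* Sketch: upload pre-baked ASTC 4x4 blocks as-is.  Everything was
                                 * compressed offline (and possibly further packed Crunch-style on
                                 * disk), so nothing gets re-encoded at load time. */
                                void upload_precompressed_astc4x4(const void *blocks, int w, int h)
                                {
                                    int size = ((w + 3) / 4) * ((h + 3) / 4) * 16;   /* 16 bytes per 4x4 block */
                                    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                                                           GL_COMPRESSED_RGBA_ASTC_4x4_KHR,
                                                           w, h, 0, size, blocks);
                                }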

