Khronos ASTC: Royalty-Free Next-Gen Texture Compression


  • #21
    Originally posted by MaxToTheMax View Post
    Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
    Yeah, my understanding is that the S3TC patents are so generic that transcoding to S3TC in that manner would still infringe them, which is why a simple solution like that hasn't already been added to Mesa.

    However, ASTC is supposed to be useful for the same kinds of uses that S3TC is good for, so hopefully it will fully replace it soon enough. The S3TC patents should expire around 2018-19 or so, which is probably around when this will start to see widespread use anyway.



    • #22
      So now that S3TC has an official replacement, maybe developers will at least be allowed to use S2TC by default.

      and wtf is "below 1bit"??? always on or always off???



      • #23
        Originally posted by jakubo View Post
        and wtf is "below 1bit"??? always on or always off???
        These compression algorithms work on large blocks of pixels, so ASTC can (for example) encode a 12x12 block in 128 bits. That is 144 pixels in 128 bits, or ~0.89 bits/pixel, but I doubt the quality will be acceptable for the majority of content at that bitrate.

        ASTC is interesting in that it can change the size of the pixel block each 128-bit word describes to get different effective bitrates, allowing a fair bit of fine-tuning based on the content being compressed.
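        To make the numbers concrete: every ASTC block is 128 bits regardless of footprint, so the effective bitrate is just 128 divided by the number of pixels in the block. A quick sketch in C (footprint list taken from the 2D block sizes in the spec) that prints the rate for each footprint:

        #include <stdio.h>

        /* Every ASTC block is 128 bits; only the pixel footprint it covers changes. */
        int main(void)
        {
            static const int fp[][2] = {
                {4, 4}, {5, 4}, {5, 5}, {6, 5}, {6, 6}, {8, 5}, {8, 6},
                {10, 5}, {10, 6}, {8, 8}, {10, 8}, {10, 10}, {12, 10}, {12, 12}
            };
            for (size_t i = 0; i < sizeof fp / sizeof fp[0]; ++i)
                printf("%2dx%-2d block: %5.2f bits/pixel\n",
                       fp[i][0], fp[i][1], 128.0 / (fp[i][0] * fp[i][1]));
            return 0;
        }

        The 12x12 footprint gives the ~0.89 bits/pixel above; 4x4 gives 8 bits/pixel, the same rate as DXT5.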



        • #24
          ARM has released an eval ASTC coder/decoder! Joy!



          • #25
            I have a question about how games approach this:
            Do they compress the texture beforehand, or compress it live (when you load the game)?
            The latter might sound silly, but the main point is to save video memory, right? If so, then you would be able to choose the compression method.
            (Although for a mobile platform, hard disk space might be a bit of an issue as well.)


            Also, I'm tempted to write an export/import plugin for ASTC in GIMP.
            Last edited by Micket; 07 August 2012, 10:04 AM.
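            In case it helps anyone poking at the same idea: the .astc files written by the ARM evaluation codec appear to start with a small 16-byte header (magic number, block footprint, then 24-bit little-endian image dimensions). A minimal sketch in C of reading it, assuming that layout:

            #include <stdint.h>
            #include <stdio.h>

            #define ASTC_MAGIC 0x5CA1AB13u

            /* 16-byte header at the start of a .astc file (assumed layout used by
             * the ARM evaluation codec): 4-byte magic, 3 bytes of block footprint,
             * then x/y/z image sizes as 24-bit little-endian integers.            */
            int read_astc_header(FILE *f, unsigned *bw, unsigned *bh,
                                 unsigned *xsize, unsigned *ysize)
            {
                uint8_t h[16];
                if (fread(h, 1, sizeof h, f) != sizeof h)
                    return -1;
                uint32_t magic = h[0] | (h[1] << 8) | (h[2] << 16) | ((uint32_t)h[3] << 24);
                if (magic != ASTC_MAGIC)
                    return -1;
                *bw = h[4];   /* block width in pixels  */
                *bh = h[5];   /* block height in pixels (h[6] is the 3D block depth) */
                *xsize = h[7]  | (h[8]  << 8) | (h[9]  << 16);
                *ysize = h[10] | (h[11] << 8) | (h[12] << 16);
                /* h[13..15] hold the z dimension for 3D textures */
                return 0;
            }

            After the header come the raw blocks, 16 bytes each, with the image rounded up to whole blocks in each dimension.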



            • #26
              ASTC for GIMP would be awesome!

              Both approaches are often used. I believe on desktop systems it's most common to have the graphics driver encode it for you on the fly, since it makes the data files of the game easier to work with.
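              For the on-the-fly case, the usual desktop trick is to hand the driver plain RGBA pixels but request a compressed internal format in glTexImage2D, so the driver's built-in encoder does the work. A minimal sketch, with S3TC as the target since that's what drivers commonly offer for online compression (whether a driver will ever encode ASTC online is another matter):

              #include <GL/gl.h>
              #include <GL/glext.h>

              /* Ask the driver to compress on upload: plain RGBA in, DXT5 out.
               * Quality is whatever the driver's (usually fast) encoder produces. */
              void upload_driver_compressed(GLsizei w, GLsizei h, const void *rgba)
              {
                  GLuint tex;
                  glGenTextures(1, &tex);
                  glBindTexture(GL_TEXTURE_2D, tex);
                  glTexImage2D(GL_TEXTURE_2D, 0,
                               GL_COMPRESSED_RGBA_S3TC_DXT5_EXT, /* driver encodes */
                               w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgba);
              }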



              • #27
                Originally posted by MaxToTheMax View Post
                Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
                Ahh, so most likely games (and drivers) will have to add native support for ASTC, such as detecting the compression libraries installed and prompting the user to select one as part of the initial game configuration.
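                A rough sketch of what the detection side could look like, assuming the game simply checks the GL extension string at startup and picks from what the driver exposes (the extension names are real; the selection logic is only illustrative):

                #include <string.h>
                #include <GL/gl.h>

                /* Pick a texture compression scheme from what the driver advertises.
                 * (Core-profile desktop GL would query glGetStringi per extension
                 * instead of the single extension string.)                         */
                const char *pick_texture_format(void)
                {
                    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
                    if (ext && strstr(ext, "GL_KHR_texture_compression_astc_ldr"))
                        return "ASTC";
                    if (ext && strstr(ext, "GL_EXT_texture_compression_s3tc"))
                        return "S3TC";
                    return "uncompressed";
                }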



                • #28
                  Yeah, something like that.



                  • #29
                    So. The license for the encoder/decoder released by ARM was nonfree, so I suppose one has to start by doing a free implementation.
                    How tedious.



                    • #30
                      Originally posted by Micket View Post
                      I have a question about how games approach this;
                      Do they compress the texture beforehand, or compress it live (when you load the game)?
                      The /vast/ majority of (mobile, at least) games compress their textures beforehand, so the artists can decide which compression methods/modes are used and make sure everything looks acceptable for what it is used for.

                      There is some interesting work on applying extra compression to an already compressed texture, such as Crunch (http://code.google.com/p/crunch/). (The compressed textures used by a graphics card need to be decodable block by block, so they can't exploit repetition or patterns between blocks.) I also believe that Civ 5 uses CUDA to transcode from something like JPEG to DXT on the graphics card, since the bottleneck is getting the data from CPU memory to graphics card memory. On mobile this is a bit different, as the CPU and GPU tend to share memory and bandwidth.
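                      For the compressed-beforehand path, loading is just a glCompressedTexImage2D call with the blocks exactly as they sit on disk (after any Crunch-style layer has been transcoded back to plain blocks); nothing gets re-encoded at load time. A minimal sketch for an ASTC 8x8 texture, with the format enum from KHR_texture_compression_astc_ldr defined in case the headers predate it:

                      #include <GL/gl.h>

                      #ifndef GL_COMPRESSED_RGBA_ASTC_8x8_KHR
                      #define GL_COMPRESSED_RGBA_ASTC_8x8_KHR 0x93B7
                      #endif

                      /* Upload pre-encoded ASTC 8x8 blocks as-is: 16 bytes per block,
                       * image padded up to whole blocks in each dimension.           */
                      void upload_astc_8x8(GLsizei w, GLsizei h, const void *blocks)
                      {
                          GLsizei nblocks = ((w + 7) / 8) * ((h + 7) / 8);
                          glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                                                 GL_COMPRESSED_RGBA_ASTC_8x8_KHR,
                                                 w, h, 0, nblocks * 16, blocks);
                      }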

