Khronos ASTC: Royalty-Free Next-Gen Texture Compression
  • Khronos ASTC: Royalty-Free Next-Gen Texture Compression

    Phoronix: Khronos ASTC: Royalty-Free Next-Gen Texture Compression

    SIGGRAPH LA 2012 is this week and, as expected, this morning marked the release of OpenGL 4.3 and OpenGL ES 3.0, as covered on Phoronix previously; their details and new features will be covered in posts on Phoronix in the next few minutes. There's also one pleasant and very welcome surprise from the Khronos Group this morning: ASTC, a royalty-free next-generation texture compression specification. With a bit of luck, ASTC will finally kick the patent-laden S3TC out the door...

    http://www.phoronix.com/vr.php?view=MTE1NDk
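    For scale: ASTC always stores a block in 128 bits but lets the encoder choose the block footprint, so the bit rate per texel is simply 128 divided by the texel count. A quick Python sketch of the resulting rates (S3TC's DXT1 is a fixed 4 bpp and DXT5 a fixed 8 bpp, for comparison):

```python
# ASTC packs every block into 128 bits; the encoder picks the block
# footprint, which is what makes the bit rate adjustable.
ASTC_BLOCK_BITS = 128

def astc_bits_per_pixel(width, height):
    """Bits per texel for an ASTC block of the given footprint."""
    return ASTC_BLOCK_BITS / (width * height)

# A few of the 2D footprints the spec defines, from highest to lowest rate.
for w, h in [(4, 4), (6, 6), (8, 8), (12, 12)]:
    print(f"ASTC {w}x{h}: {astc_bits_per_pixel(w, h):.2f} bpp")
```

    So a 4x4 footprint matches DXT5's 8 bpp while 8x8 already halves DXT1's rate, which is where the quality/size flexibility over S3TC comes from.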

  • curaga
    replied
    Originally posted by Micket View Post
    Are GPUs without hardware support for ASTC limited to decoding in advance and using a decompressed texture only?
    Could it not be implemented in software running in some shader? (At a cost of course, but I was under the impression that decoding ASTC would be fast. Perhaps not?)
    That would only be viable on a high-end GPU. And on such a GPU you're not usually lacking VRAM or bandwidth, so there's little reason to do so.

    Note that this applies to general shader texture decompression; I have no idea whether it's even feasible to do ASTC in a shader.


  • Micket
    replied
    Are GPUs without hardware support for ASTC limited to decoding in advance and using a decompressed texture only?
    Could it not be implemented in software running in some shader? (At a cost of course, but I was under the impression that decoding ASTC would be fast. Perhaps not?)


  • GreatEmerald
    replied
    Originally posted by DrYak View Post
    As soon as S3TC started becoming popular, a "compressed texture CD" was released for Unreal back then.
    Small correction: it was released for Unreal Tournament, not Unreal, even though Unreal also had support for textures compressed directly in S3TC. That allowed the community to produce their own extremely high-resolution textures compressed in S3TC to replace the stock ones (and even the official S3TC ones). Starting with Unreal Engine 2, that was no longer possible due to engine changes, but the engine still used S3TC-compressed textures internally.


  • DrYak
    replied
    Originally posted by Micket View Post
    I have a question about how games approach this:
    Do they compress the texture beforehand, or compress it live (when you load the game)?
    The latter might sound silly, but the main point is to save video memory, right? If so, then you would be able to choose the compression method.
    Depends on the engine:
    - Some engines compress on the fly. For example, the older id Tech 3 engine works that way: textures are stored as JPEG files. At load time the files are decompressed, and then either uploaded as-is to the graphics card, or recompressed on the fly and then uploaded.
    Thus, at the cost of some (significant) load-time waiting, the game gets very fast and efficient texturing based on the latest available compression algorithm.
    As an example, this enabled Quake 3 to benefit from the slightly more efficient (it can switch modes between blocks and has some very efficient or higher-quality modes) but less widespread (only 3dfx and Intel hardware) FXT1 texture compression, at a time when everybody else was only doing S3TC. It could also support the radically different VQ compression on Dreamcast (PowerVR) hardware. But it was slower to load on anything short of top-of-the-line hardware.
    Games based on this and similar engines will automagically be able to use ASTC on hardware that supports it, with little or no modification to the current code base. (The OpenGL driver is responsible for handling most of the ASTC work.)
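    That load-time decision can be sketched as a simple best-first preference list. This is a minimal illustration with hypothetical names, not actual id Tech 3 code: decode the stored image on the CPU, then recompress into the best format the driver reports, or upload raw pixels if nothing is supported.

```python
# Hypothetical sketch of an id Tech 3-style upload path: after decoding
# the stored JPEG, recompress to the best GPU format available.
PREFERENCE = ["ASTC", "FXT1", "S3TC"]  # best-first preference order

def pick_upload_format(supported_formats):
    """Return the first preferred compressed format the GPU supports,
    or "RAW" to upload the decoded pixels uncompressed."""
    for fmt in PREFERENCE:
        if fmt in supported_formats:
            return fmt
    return "RAW"

print(pick_upload_format({"S3TC"}))          # typical 2012 hardware
print(pick_upload_format({"ASTC", "S3TC"}))  # future ASTC-capable hardware
print(pick_upload_format(set()))             # no compression support at all
```

    The point of keeping source assets in a generic format (JPEG here) is exactly that this table can grow a new entry without touching the assets.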

    - Some games ship with pre-compressed textures: as soon as S3TC started becoming popular, a "compressed texture CD" was released for Unreal back then. Not much luck for ASTC there, except maybe modifying the engine to recompress to ASTC at load time. That isn't very useful: whatever performance gain the more efficient ASTC compression could squeeze out comes at the cost of longer loading times and degraded quality (due to the lossy decompression-recompression cycle).
    (On hardware not supporting compression, textures are decompressed at load time by the driver and stored uncompressed on the graphics card.)

    - A slightly more modern variation: textures aren't stored directly in S3TC on the disc, but in some pre-processed format which is both easy to recode into S3TC and nicely compressible by the CPU, stored in a CPU-compressed container (e.g. using LZMA compression). Several such formats are popular with DirectX games.
    The CPU unpacks the file and transcodes the output into S3TC. Thus the files are smaller (like the JPEG method, thanks to the CPU compression) but the load time is shorter (no need for full decompression and recompression; unpacking the file already yields something close to, and easy to convert to, S3TC).
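    The "CPU-compressed container" half of this can be demonstrated with Python's stdlib lzma module. GPU block formats are fixed-rate and can't share data between blocks, so repetitive content leaves redundancy a general-purpose compressor still removes; the block data below is synthetic, purely for illustration.

```python
import lzma

# Synthetic stand-in for pre-processed texture blocks: repeated content
# (sky, flat walls, tiling patterns) makes near-identical blocks that a
# fixed-rate GPU format stores in full but LZMA can deduplicate on disc.
blocks = bytes([17, 42, 99, 3] * 4) * 1024   # 16 KiB of repetitive "blocks"

packed = lzma.compress(blocks)     # what would ship on disc
unpacked = lzma.decompress(packed) # what the loader hands to the transcoder

assert unpacked == blocks          # the container stage is lossless
print(len(blocks), "->", len(packed), "bytes on disc")
```

    Only the container stage is lossless; the transcode from the unpacked intermediate into S3TC is where the (cheap, pre-arranged) lossy step happens.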

    Originally posted by halfmanhalfamazing View Post
    And does it require new hardware?
    Depends:
    - To just play a game whose data is stored in ASTC (or an easily ASTC-transcodable format, say ASTC in an LZMA container): no new hardware is needed. In the worst case, the driver can always decompress the texture in software and recompress it into whatever suits the hardware (at the cost of longer loading times, degraded quality due to the lossy compression/decompression cycle, and probably lower performance, because the other hardware-supported compression formats are probably less efficient).

    - To benefit from the advantages of ASTC (namely more efficient or higher-quality compression) you *NEED* hardware whose texturing unit supports ASTC (so it can load the smaller, compressed texture directly from video RAM into the texture cache). As ASTC is patent-free and has tons of advantages (efficiency, quality), it will very likely get picked up rather quickly by hardware vendors. Probably too late for the generation of hardware currently in development, but probably in the generation after that. That means that within a year at most there will be hardware around supporting ASTC.
    Also, given the turnover of gaming machines, it means that in 3-4 years there will be enough graphics cards around with ASTC support.
    Which means that developers who start writing games right now have an interest in including support for ASTC. By the time the game is out, ASTC will be hardware-supported on some machines, and the others can go the software "ASTC-to-S3TC" route. Or they can even prepare different data packages for ASTC and S3TC from the same assets.
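    The video-memory stakes behind "you need hardware support" are easy to put in numbers. A quick sketch of one mip level of a 2048x2048 texture at the rates discussed above (mipmaps ignored for simplicity):

```python
def texture_bytes(width, height, bits_per_pixel):
    """Size in bytes of a single mip level at the given fixed rate."""
    return width * height * bits_per_pixel // 8

W = H = 2048
for name, bpp in [("RGBA8 uncompressed", 32),
                  ("S3TC DXT1", 4),
                  ("ASTC 8x8", 2)]:
    mib = texture_bytes(W, H, bpp) / 2**20
    print(f"{name}: {mib:.1f} MiB")
```

    Only a texturing unit that reads the compressed blocks directly gets these savings in VRAM and memory bandwidth; a software ASTC-to-S3TC transcode at load time only ever reaches the S3TC line.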


  • jonnyh
    replied
    Originally posted by Micket View Post
    I have a question about how games approach this:
    Do they compress the texture beforehand, or compress it live (when you load the game)?
    The /vast/ majority of (mobile, at least) games compress beforehand, so the artists can decide which compression methods/modes are used and check that they all look acceptable for their purpose.

    There is some interesting work on applying extra compression to an already-compressed texture, such as Crunch (http://code.google.com/p/crunch/). (The compressed textures a graphics card uses need each block to be decompressible independently, so they can't exploit repetition or patterns between blocks.) I also believe Civ5 uses CUDA to decompress from something like JPEG to DXT on the graphics card, since the bottleneck is getting data from CPU memory to graphics card memory. On mobile this is a bit different, as the CPU and GPU tend to share memory and bandwidth.


  • Micket
    replied
    So. The license for the encoder/decoder released by ARM was nonfree, so I suppose one has to start by doing a free implementation.
    How tedious.


  • MaxToTheMax
    replied
    Yeah, something like that.


  • DeepDayze
    replied
    Originally posted by MaxToTheMax View Post
    Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
    Ahh, so most likely games (and drivers) will then have to add native ASTC support, such as detecting the compression libraries installed and prompting the user to select one as part of the initial game configuration.


  • MaxToTheMax
    replied
    ASTC for GIMP would be awesome!

    Both approaches are often used. I believe on desktop systems it's most common to have the graphics driver encode it for you on the fly, since it makes the data files of the game easier to work with.
