Khronos ASTC: Royalty-Free Next-Gen Texture Compression


  • #31
    Originally posted by Micket View Post
    I have a question about how games approach this:
    Do they compress the texture beforehand, or compress it live (when the game loads)?
    The latter might sound silly, but the main point is to save video memory, right? If so, then you would be able to choose the compression method.
    Depends on the engine:
    - Some engines compress on the fly. The older idtech3 engine, for example, works that way: textures are stored as JPEG files, and at load time they are decompressed and then either uploaded as-is to the graphics card or recompressed on the fly and uploaded. Thus, at the cost of some (significant) load-time waiting, the game runs with fast, efficient textures using the best compression the hardware offers (a sketch of both upload paths follows this list).
    As an example, this let Quake 3 benefit from FXT1 texture compression, which was slightly more efficient (it can switch modes between blocks and has some very efficient or higher-quality modes) but less widespread (only 3dfx and Intel hardware), at a time when everybody else was only doing S3TC. It could also support the radically different VQ compression on the Dreamcast's PowerVR hardware. The price was slower loading on anything but top-of-the-line machines.
    Games based on this and similar engines will automagically be able to use ASTC on hardware that supports it, with little to no modification of the current code base (the OpenGL driver handles most of the ASTC work).

    - Some games come with pre-compressed textures: as soon as S3TC started becoming popular, a "compressed texture CD" was released for Unreal back then. Not much luck for ASTC there, short of modifying the engine to recompress to ASTC at load time, which isn't very useful: whatever performance gain the more efficient ASTC compression squeezes out comes at the cost of longer loading times and degraded quality (due to the lossy decompress-recompress cycle).
    (On hardware that doesn't support compression at all, the driver decompresses the textures at load time and stores them uncompressed on the graphics card.)

    - A slightly more modern variation: textures aren't stored directly as S3TC on the disc, but in some pre-processed format that is both easy to transcode into S3TC and compresses well with a general-purpose CPU codec, so it can be stored in a CPU-compressed container (using LZMA, for instance). Several such formats are popular with DirectX games.
    The CPU unpacks the file and transcodes the output into S3TC. The files are smaller (like with the JPEG method, thanks to the CPU compression), but loading is faster (no full decompression and recompression; unpacking already yields something close to S3TC and easy to convert). The final upload is the same pre-compressed path shown in the sketch below.
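    Here is a minimal sketch (untested, my own illustration rather than any particular engine's code) of the two upload paths above, assuming a desktop OpenGL context that exposes EXT_texture_compression_s3tc and KHR_texture_compression_astc_ldr; a texture object is assumed already bound to GL_TEXTURE_2D, and mipmaps and error handling are left out:

    #define GL_GLEXT_PROTOTYPES   /* assumes Linux/Mesa-style headers */
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Path 1 (idtech3 style): hand the driver plain RGBA pixels, e.g. freshly
     * decoded from JPEG, and only ask for a compressed internal format.
     * The driver performs the actual S3TC encoding during the upload. */
    static void upload_and_let_driver_compress(const void *rgba, int w, int h)
    {
        glTexImage2D(GL_TEXTURE_2D, 0,
                     GL_COMPRESSED_RGBA_S3TC_DXT5_EXT, /* "compress this for me" */
                     w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    }

    /* Path 2 (pre-compressed assets, or blocks transcoded on the CPU at load
     * time): the data is already in the wire format the texturing unit reads,
     * so it is uploaded verbatim. The same call covers S3TC and ASTC; only the
     * format token and the block data differ. */
    static void upload_precompressed(const void *blocks, GLsizei num_bytes,
                                     int w, int h,
                                     GLenum fmt /* e.g. GL_COMPRESSED_RGBA_ASTC_4x4_KHR */)
    {
        glCompressedTexImage2D(GL_TEXTURE_2D, 0, fmt, w, h, 0, num_bytes, blocks);
    }

    Path 1 is what happens under the hood for an idtech3-style engine; path 2 covers both shipped pre-compressed assets and blocks produced by a CPU-side transcoder at load time.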

    Originally posted by halfmanhalfamazing View Post
    And does it require new hardware?
    Depends:
    - To just play a game whose data is stored as ASTC (or in an easily ASTC-transcodable format, say ASTC in an LZMA container): no new hardware is needed. In the worst case, the driver can always decompress the textures in software and recompress them into whatever the hardware supports (at the cost of longer loading times, degraded quality from the lossy compression/decompression cycle, and probably slower performance, because the other hardware-supported compression formats are probably less efficient).

    - To benefit from the advantages of ASTC (namely more efficient or higher-quality compression), you *NEED* hardware whose texturing unit supports ASTC, so it can load the smaller, compressed texture directly from video RAM into the texture cache. As ASTC is royalty-free and has plenty of advantages (efficiency, quality), it will very likely be picked up rather quickly by hardware makers. Probably too late for the generation of hardware currently in development, but likely in the generation after that, which means that within a year at most there will be hardware around supporting ASTC.
    Given the turnover of gaming machines, that also means that in 3-4 years there will be enough graphics cards around with ASTC support.
    So developers who start writing games right now have every interest in including support for ASTC. By the time the game ships, ASTC will be hardware-supported on some machines, and the others can take the software "ASTC-to-S3TC" route. They can even prepare separate ASTC and S3TC data packages from the same assets. A rough sketch of the runtime check is below.
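    Something like the following (assuming a GL 3+ context; load_astc_assets() and load_s3tc_assets() are hypothetical placeholders for whatever loading paths the engine actually has):

    #include <stdbool.h>
    #include <string.h>
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Hypothetical engine hooks, not a real API. */
    void load_astc_assets(void);
    void load_s3tc_assets(void);

    /* True if the driver advertises the given extension (GL 3+ style query). */
    static bool has_extension(const char *name)
    {
        GLint i, n = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &n);
        for (i = 0; i < n; ++i)
            if (strcmp((const char *)glGetStringi(GL_EXTENSIONS, i), name) == 0)
                return true;
        return false;
    }

    void choose_texture_package(void)
    {
        if (has_extension("GL_KHR_texture_compression_astc_ldr")) {
            /* The texturing unit reads ASTC directly: use the ASTC package as-is. */
            load_astc_assets();
        } else {
            /* No hardware ASTC: fall back to an S3TC package built from the same
             * assets, or transcode ASTC -> S3TC on the CPU at load time
             * (longer loading, some extra quality loss). */
            load_s3tc_assets();
        }
    }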



  • #32
    Originally posted by DrYak View Post
    As soon as S3TC started becoming popular, a "compressed texture CD" was released for Unreal back then.
    Small correction: it was released for Unreal Tournament, not Unreal, even though Unreal also had support for textures compressed directly in S3TC. That allowed the community to produce their own extremely high-resolution textures compressed in S3TC to replace the stock ones (and even the official S3TC ones). Starting from Unreal Engine 2, that was no longer possible due to engine changes, but the games still used S3TC-compressed textures internally.



  • #33
    Are GPUs without hardware support for ASTC limited to decoding in advance and using only a decompressed texture?
    Could it not be implemented in software running in a shader? (At a cost, of course, but I was under the impression that decoding ASTC would be fast. Perhaps not?)



  • #34
    Originally posted by Micket View Post
    Are GPUs without hardware support for ASTC limited to decoding in advance and using only a decompressed texture?
    Could it not be implemented in software running in a shader? (At a cost, of course, but I was under the impression that decoding ASTC would be fast. Perhaps not?)
    That would only be viable on a high-end GPU, and on such a GPU you're usually not short of VRAM or bandwidth, so there's little reason to do it.

    Note that this applies to shader-based texture decompression in general. I have no idea whether it's even feasible to do ASTC in a shader.

