You said earlier that there were lots of cases where this had happened. Can you please point to a single example? Or is this completely based on your imagination of what you wish would happen?
Originally Posted by droidhacker
I can only say it's your choice whether you obey the law or not. And frankly, this isn't just about clearly breaking some law or not, this whole situation is unclear and no major organisation is willing to take the risk of finding out. I can understand that, but why not shift the risk to users then?
Originally Posted by [Knuckles]
Hypothetically speaking, how would S3 even go about enforcing this patent in case end-users violate it? And consider who's the end-user in this case. I don't think they'd have an easy time doing that.
You don't understand how GPUs work or what texture compression accomplishes. DXTC/S3TC is specifically designed to be (near) instantly decompressible by a massively parallel processor without needing to make a large number of memory accesses.
Originally Posted by V!NCENT
Deflate/gzip/lzma/etc. all require you to essentially start reading a file at the beginning and scanning through it to the part you want to access. That would mean a pixel shader goes from an O(1) operation on a simple texture access to an O(N) operation. You'd essentially slam the GPU back into the stone age by requiring that kind of texture access at runtime.
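The random-access property described above is easy to show in code. Here is a hedged sketch (illustrative Python, not how any real driver implements it) of fetching a single texel from DXT1/S3TC data: only the one 8-byte block containing that texel is ever read, so a lookup stays O(1) no matter how large the texture is.

```python
import struct

def rgb565_to_rgb(c):
    # Expand a 16-bit RGB565 value to an 8-bit (r, g, b) tuple.
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return ((r * 255) // 31, (g * 255) // 63, (b * 255) // 31)

def dxt1_texel(data, width, x, y):
    """Fetch one texel from DXT1 data in O(1): read only the
    8-byte 4x4 block containing (x, y). width must be a multiple of 4."""
    blocks_per_row = width // 4
    block_index = (y // 4) * blocks_per_row + (x // 4)
    c0, c1, bits = struct.unpack_from('<HHI', data, block_index * 8)
    p0, p1 = rgb565_to_rgb(c0), rgb565_to_rgb(c1)
    if c0 > c1:  # four-colour mode: two interpolated colours
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:        # three-colour mode plus "transparent black"
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    # 2-bit palette index for texel (x%4, y%4) inside the block
    idx = (bits >> (2 * ((y % 4) * 4 + (x % 4)))) & 0x3
    return palette[idx]
```

A streaming format like gzip has no equivalent of `block_index`: finding the bytes for texel (x, y) requires decoding everything before them.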
If you're implying that we should just store textures uncompressed in video memory, then you are making two very silly assumptions. First, you're assuming that having 1GB of video memory means that we can just slap uncompressed raw textures for a modern game into vmem and have room to spare; that's just ridiculously wrong. Second, you're assuming that the size of memory has no impact on the performance of memory accesses inside of shaders, which is also wrong for any well-designed memory controller (especially when the underlying memory layout uses tiling to make any additional texture lookups in a particular shader likely to closely coincide with the address of the first lookup).
Thanks for the suggestion, though. I'm sure the hardware engineers and specification authors never thought of that before, ever. Your power of Internet forum logic will surely revolutionize the GPU industry and free them of the evil patent tyrants.
There are other, unpatented, texture compression schemes, though. They just aren't used as often as S3TC because that was put into DirectX a long time ago.
Notably including RGTC, which is required by OpenGL 3.0, and already supported in Mesa on at least swrast, i965c, r300g, r600g and nouveau.
Originally Posted by smitty3268
Mesa also supports LATC, but only on swrast, r300g and r600g. LATC is, like S3TC, not required by any OpenGL version, but still supported by the blobs.
Developers of free games should probably investigate whether RGTC or LATC is suitable for their needs before committing to S2TC...
Last edited by Jonno; 07-20-2011 at 07:25 PM.
Here are summaries of the GIF image file format patent story.
Maybe they can serve as examples of how to handle this patent.
As I understand it (correct me if I'm wrong), the basic idea behind this workaround is that if games pass an S3TC compressed texture to mesa, it will pass it through to the graphics card which will uncompress it in hardware. Presumably the assumption here is that the graphics card vendor has a patent license for S3TC decode so all will be well.
Is this true, though? I have a vague memory of reading something somewhere that vendors' patent rights are very narrow and may only cover the use of the graphics hardware in combination with the proprietary driver, so having an open source driver tell the hardware "here, decode this" may still run afoul of the patent.
You may be right, but Mesa already takes the risk of passing raw data through.
Neither is, as not enough graphics chipsets support them yet.
Originally Posted by Jonno
RGTC is also only suitable for normal maps - it is "Red/Green Texture Compression" after all and cannot encode a blue channel. And LATC only contains luminance and alpha.
Furthermore, from reading the RGTC extension spec, I am very sure it infringes on the S3TC patent, as it makes use of the very same compression technique (namely, inferred color values). In fact, it is identical to the S3TC/DXT5 alpha channel encoding. Instead, RGTC and LATC are special cases that I should maybe support as a subset in my S2TC compressor too. I guess Mesa is taking the risk because the format has a different name and it is not "universally known" that RGTC is mostly identical to one S3TC mode.
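For comparison, here is a hedged sketch (illustrative Python, not taken from any real codec) of RGTC1/BC4 block decoding: two 8-bit endpoints plus six inferred in-between values, which is exactly the "inferred values" technique described above and structurally the same scheme as the DXT5 alpha channel.

```python
def bc4_block_palette(red0, red1):
    """Build the 8-entry value palette for one RGTC1/BC4 block:
    two stored endpoints, the remaining six entries inferred."""
    if red0 > red1:
        # Six interpolated values between the endpoints.
        pal = [red0, red1] + [((8 - i) * red0 + (i - 1) * red1) // 7
                              for i in range(2, 8)]
    else:
        # Four interpolated values, plus explicit 0 and 255.
        pal = [red0, red1] + [((6 - i) * red0 + (i - 1) * red1) // 5
                              for i in range(2, 6)] + [0, 255]
    return pal

def rgtc1_texel(block, x, y):
    """Decode the red value of texel (x, y) in one 4x4 RGTC1 block
    (8 bytes: 2 endpoint bytes + 48 bits of 3-bit indices)."""
    red0, red1 = block[0], block[1]
    bits = int.from_bytes(block[2:8], 'little')
    idx = (bits >> (3 * (y * 4 + x))) & 0x7
    return bc4_block_palette(red0, red1)[idx]
```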
Last edited by divVerent; 07-21-2011 at 12:45 AM.
A little update on the 0 A.D. hack as I remember more: on Linux it tries to initialize OpenGL; if the S3TC extension is not available, it sets force_s3tc_enable=1 (the env variable that forces S3TC on in Mesa, for decompression only) and reinitializes OpenGL. If the S3TC extension is now available, it will use it for decompression only. Dunno if it still does that.
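For what it's worth, that fallback logic might look roughly like this (a Python sketch with hypothetical callback names; the real 0 A.D. code is C++, and the exact value Mesa accepts for the variable should be checked against its driconf documentation):

```python
import os

S3TC_EXT = "GL_EXT_texture_compression_s3tc"

def ensure_s3tc(get_extensions, reinit_gl):
    """Sketch of the described fallback: if the S3TC extension is
    missing, force-enable it in Mesa (decompression only) and
    reinitialize the GL context. get_extensions and reinit_gl are
    hypothetical stand-ins for the engine's own GL plumbing."""
    if S3TC_EXT in get_extensions():
        return True
    # Mesa-specific knob; the post says it is set to 1.
    os.environ["force_s3tc_enable"] = "1"
    reinit_gl()  # tear down and re-create the OpenGL context
    return S3TC_EXT in get_extensions()
```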
Anyway, I remember there were a lot of proposals to work around the S3TC patent, e.g.: http://cgit.freedesktop.org/~csimpso...tc-by-the-book
Just search on the ML for many others, but none was ever accepted.
This somewhat reminds me of the attempts at proving the Riemann hypothesis...