Crunch Texture Compression Showing Off Promising Results For Unity


    Phoronix: Crunch Texture Compression Showing Off Promising Results For Unity

The Crunch texture compression library, developed by former Valve Linux/OpenGL engineer Rich Geldreich, who co-founded the Binomial consulting firm, is making much progress in showing off impressive compression capabilities for game engines...

    http://www.phoronix.com/scan.php?pag...ing-Unity-2017

  • #2
Rich Geldreich has probably had to face quite a few jokes about his name. His family name literally means "money-rich" in German...

    • #3
Originally posted by jhenke
Rich Geldreich has probably had to face quite a few jokes about his name. His family name literally means "money-rich" in German...
So what? ≈65% of last names I see, both Latin and Cyrillic, directly mean something, and another ≈10-15% resemble real words.

      • #4
        Valve realizes the thousands of asset flips on the content servers take up a lot of space, so they send a dev to help work on the Asset Flip Engine, so "games" get smaller, and Valve saves server space. Genius.

        • #5
Originally posted by eydee
          Valve realizes the thousands of asset flips on the content servers take up a lot of space, so they send a dev to help work on the Asset Flip Engine, so "games" get smaller, and Valve saves server space. Genius.
Some AAA games get artificially inflated for some obscure reason, like 40 to 60 GB on their console disc. Some say it is to discourage piracy. But some indie games are unbelievably large for the graphics and content you see on the screen. I have some 2D, pixelated indie side-scrolling games that are larger than 1 GB. You see them and ask yourself how they managed to take a 16-bit SNES/Mega Drive-style game and inflate it to such a file size.

          • #6
            at least one typo: ANdroid

            • #7
              "The Crunch library continues to be hosted on GitHub but hasn't been updated there since January."

              As mentioned in the linked blog post, "The latest version of the Crunch library can be found in the following GitHub repository: https://github.com/Unity-Technologies/crunch/tree/unity"

              • #8
As someone who isn't a game developer but has spent some time with compression, I'm driven to ask if they're doing the right thing here. They quote improvements in compression speed, but tiny gains in compression ratio. As a 'compress once, store millions of times' task, isn't the compression ratio truly the most important metric? Time should be something you're willing to sacrifice a lot of to get better compression. Hearing them say it's 5x faster makes me wonder if they could have gotten better compression if they had spent 5x *longer* trying to do a better job.

I understand that the write/compile/test loop needs to be kept short to keep developers productive, but you only need to recompress the textures when you change them--which I wouldn't expect to be that often--and you don't need to do the uber-super compression on them at that point--only before final testing. I guess the 5x speedup can help a little in the case where the developer is tweaking textures. I don't know the workflow; is that so big of a concern?
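The tradeoff described here can be sketched with a quick stdlib experiment. This is purely illustrative and uses zlib, not Crunch, on synthetic bytes standing in for texture data; the point is just that higher effort levels cost disproportionately more time for comparatively small ratio gains, while decompression stays cheap.

```python
# Illustrative only: zlib (not Crunch) on synthetic "texture" bytes,
# comparing effort levels for the compress-once/decompress-many tradeoff.
import time
import zlib

# Mildly redundant synthetic data standing in for texture bytes.
data = bytes((i * 31 + j) % 256 for i in range(512) for j in range(512))

for level in (1, 6, 9):
    start = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(packed)} bytes, {elapsed * 1000:.1f} ms")

# Decompression speed is roughly independent of the compression level,
# which is why an asset pipeline can afford slow, high-effort compression.
assert zlib.decompress(zlib.compress(data, 9)) == data
```

On data like this, the jump from level 1 to 9 typically costs several times the wall-clock time for a single-digit-percent size improvement, which mirrors the "small ratio gains, big speed differences" pattern the post is questioning.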

                • #9
Originally posted by [email protected]

Some AAA games get artificially inflated for some obscure reason, like 40 to 60 GB on their console disc. Some say it is to discourage piracy. But some indie games are unbelievably large for the graphics and content you see on the screen. I have some 2D, pixelated indie side-scrolling games that are larger than 1 GB. You see them and ask yourself how they managed to take a 16-bit SNES/Mega Drive-style game and inflate it to such a file size.
Because it's built using a high-level IDE where programming isn't required. It imports everything by default and runs layers upon layers of bloat.

I've played a few indie games in the last year that had install files of 20-80 MB, and I thought they were suspiciously small, but the graphics and gameplay turned out amazing. And some games are legitimately 40-60 GB because of textures. The hi-res texture pack for Fallout 4 is 55 GB by itself, and I felt it was worth it.

                  • #10
Originally posted by willmore
As someone who isn't a game developer but has spent some time with compression, I'm driven to ask if they're doing the right thing here. They quote improvements in compression speed, but tiny gains in compression ratio. As a 'compress once, store millions of times' task, isn't the compression ratio truly the most important metric? Time should be something you're willing to sacrifice a lot of to get better compression. Hearing them say it's 5x faster makes me wonder if they could have gotten better compression if they had spent 5x *longer* trying to do a better job.

I understand that the write/compile/test loop needs to be kept short to keep developers productive, but you only need to recompress the textures when you change them--which I wouldn't expect to be that often--and you don't need to do the uber-super compression on them at that point--only before final testing. I guess the 5x speedup can help a little in the case where the developer is tweaking textures. I don't know the workflow; is that so big of a concern?
Not in this case, IMHO. The number one reason our users give (I work at Unity) for not using Crunch has been "compression is too slow", and number two has been "oh, but I need mobile support". This branch on GitHub (linked from the original article) fixes exactly these two problems.

The additional file-size gains are just a kind of unrelated bonus and were not even the primary concern. Crunch compression is pretty good already -- you could think of it as roughly "JPEG-sized". Would investing in "better compressed JPEG" be worth it? Possibly. But you'd hardly come up with something that improves JPEG compression by a 5x or similar factor.


BTW, the article here on Phoronix makes it sound like this has something to do with Valve (it doesn't), or that the improvements have not been open sourced (they are right there on GitHub, at the link the original blog post gives).
