H.266/VVC Standard Finalized With ~50% Lower Size Compared To H.265


  • #81
    Originally posted by uid0 View Post
    Could you describe what the differences sound like? Is there a perceptible signature for each codec (at higher bitrates), so you can tell them apart?
    Yes, I'll describe the differences which I hear on the most "damaged" instruments. The one I complained about most was MP3, at its highest "standard" bit rate of 320kb/sec. I am a pro musician, now elderly, and I would have noticed even bigger differences 10 or 15 years ago - my ears are now somewhat damaged.

    It's less a perceptible signature for each codec, and more of a generic complaint that "this doesn't sound right". MP3 would NOT be a lot worse than AAC if higher bit rates were in wider use. But 3 orchestra instruments show the biggest problems with MP3 at 320kb "standard" high quality. Here are the details for those 3, which I will try to describe in detail.

    With MP3, even at 320k, the most obvious difference occurs with cymbals (good ones). They lose the qualities of 'air' and 'space' which they have in real life. FLAC can mostly maintain those qualities, being compromised primarily by the quality of the microphones and speakers/headphones which you use for playback. AAC is a bit compromised in these properties, but very close to FLAC when encoded at 384kb or higher. When raising the MP3 bitrate to 384kb, I am still about 70% correct in guessing AAC versus MP3 samples - that's better than 2:1 accuracy in guessing, and indicates to me that AAC is a superior compression codec.

    With MP3 at the highest standard rate of 320kb, a solo violin loses most of the characteristics of "woody" and "moving the air", sounding more like a synthesizer-generated tone for generic "strings". And third, among the wind instruments, the oboe is the most damaged, sounding more and more "like just another clarinet" when the bit rate is pushed too far down or the codec is "bad".

    Opus, tuned for full-bandwidth music, nearly matches AAC 384Kb while using only 256Kb bandwidth. But when I turn it down to only 128Kb, it is slightly worse than MP3-320kb -- my ears are more aware that "this sounds like cheap stereo, rather than real life".
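
    A minimal sketch of how such comparison files could be generated, for anyone who wants to try the same kind of listening test (assuming ffmpeg is installed with the libmp3lame, aac and libopus encoders; "master.wav" is just a placeholder for a real lossless source):

```python
# Minimal sketch: encode one lossless source to MP3, AAC and Opus at the same
# nominal bitrate so the results can be blind-compared by ear.
# Assumes ffmpeg is on PATH with the libmp3lame, aac and libopus encoders;
# "master.wav" is a placeholder name, not a real file from this thread.
import subprocess

SOURCE = "master.wav"
BITRATE = "256k"

ENCODERS = {
    "mp3":  ["-c:a", "libmp3lame"],
    "m4a":  ["-c:a", "aac"],
    "opus": ["-c:a", "libopus"],
}

for ext, codec_args in ENCODERS.items():
    out = f"compare_{BITRATE}.{ext}"
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, *codec_args, "-b:a", BITRATE, out],
        check=True,
    )
    print("wrote", out)
```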



    • #82
      Originally posted by SirMaster View Post

      Reference H.265 did against reference H.264.

      I think you are mistakenly comparing early reference H.265 encoders vs. years and years of optimized x264 encoder.

      If you want to compare to x264, you need to use a recent x265 encoder build which is now quite a lot better.

      I encode 4K content with x265 and am able to get much smaller files at the same quality as x264 these days.
      Are you getting 50% smaller files while targeting a *very* high quality level?
      I haven't played with x265 for some time, but I'd guess it's still far from that goal. It was good when targeting extremely small sizes, but when your target is Blu-ray-ish quality it wasn't nearly as good.
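
      A rough sketch of how such a size comparison could be run (assuming an ffmpeg build with libx264 and libx265, and a placeholder "clip.mkv" test file; note that CRF values do not map 1:1 between the two encoders, so the numbers below are only approximate quality targets):

```python
# Rough sketch: encode the same clip with x264 and x265 and compare output sizes.
# Assumes ffmpeg with libx264/libx265; "clip.mkv" is a placeholder test clip.
# CRF scales differ between the encoders, so these values are only a starting
# point, not a strict matched-quality comparison.
import os
import subprocess

SOURCE = "clip.mkv"
JOBS = [
    ("x264", ["-c:v", "libx264", "-preset", "slow", "-crf", "18"]),
    ("x265", ["-c:v", "libx265", "-preset", "slow", "-crf", "20"]),
]

for name, args in JOBS:
    out = f"{name}_test.mkv"
    # -an drops audio so only the video stream affects the size comparison
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *args, "-an", out], check=True)
    print(f"{name}: {os.path.getsize(out) / 1e6:.1f} MB")
```
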
      ## VGA ##
      AMD: X1950XTX, HD3870, HD5870
      Intel: GMA45, HD3000 (Core i5 2500K)



      • #83
        Originally posted by curfew View Post
        No it doesn't.
        I'm talking about real world code complexity and the target bitrates being halved. 0.5x the bitrate means 2x the code complexity IN THIS CASE
        Last edited by lyamc; 07-07-2020, 01:14 PM.



        • #84
          Originally posted by pal666 View Post
          flac has nothing to do with decency of audio setup, it can't be distinguished from high bitrate opus in blind testing. there's a case for it as a source format, but it doesn't make sense for listening
          I respectfully disagree. Commodity audio gear (DACs, amplifiers, speakers, etc.) usually does not have the bandwidth to reproduce the information stored in uncompressed files. Such gear typically compromises in the same parts of the spectrum where lossy audio formats discard information.
          This is also a difficult topic to blind test because the results depend on the audio system, group/background of listeners, etc.

          For reference, I know folks with $10k++ audio setups who claim that they can tell the difference between CD quality and high-resolution PCM or DSD files. I don't think I would survive such a blind test on my setup, but I can definitely tell the difference between 192kbps MP3 files (possibly higher, but I didn't try) and CD quality, especially on recordings that I know.

          Of course, any lossy audio format approaches lossless WAV/FLAC if you throw enough bitrate at it. At that point, though, I would argue: what is the point?
          For music that I care about, I have no problem with ~15MB files for ~3min tracks.
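
          As a quick back-of-the-envelope check using those round figures (they are the approximations quoted above, not measurements), that works out to roughly FLAC-typical bitrates:

```python
# Back-of-the-envelope check: average bitrate implied by a ~15 MB file for a
# ~3 minute track. The numbers are the round figures from the post above.
size_bytes = 15 * 1_000_000   # ~15 MB
duration_s = 3 * 60           # ~3 minutes

kbps = size_bytes * 8 / duration_s / 1000
print(f"average bitrate ~= {kbps:.0f} kbps")  # ~= 667 kbps, typical FLAC territory
```
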
          Last edited by mppix; 07-07-2020, 03:15 PM.



          • #85
            Originally posted by jonsmirl View Post
            It is just going to be another patent royalty train wreck like h.265.
            There are many patent claims on pretty much any/all codecs (it is the nature of such advanced work). There are (last I glanced in that direction) over 1000 claims just on AV1 technology (which was designed to be royalty-free), although whether any of those claims are valid is still in various stages of review/discussion. If any of the claims are upheld in any jurisdiction, the promise of royalty-free may not survive (although the members of the consortium might just buy out the rights, as has been done before in certain cases of IP).

            H.265 was, of course, especially problematic because of multiple partially overlapping, but individually incomplete, patent pools, along with a few claims that were not in any pool. It appears that H.266 may have a similar problem (multiple competing pools), although some of the major participants are well aware that multiple pools are not at all desirable for adoption.



            • #86
              Originally posted by rickst29 View Post
              I disagree with this statement - but I have high-quality source material (actual recording sessions of acoustic instruments, singers, and genuine percussion instruments). When I downgrade my 24-bit/96kHz recordings into something else for distribution, the difference between AAC @ 384kb and MP3 @ 384kb is obvious. But MP3 presents another quality problem: AFAIK, only LAME can encode at more than 320Kb, and many consumer devices can't handle higher nonstandard bit rates. I always store my originals as FLAC. For compressed audio at lower bit rates, I prefer Opus > AAC > AC3 > MP3, although I don't use Opus very often. (I only encode to Opus when creating VP9 videos.)

              If the comparison is made by starting with a highly compressed and typically mis-engineered pop music CD - then yeah, 320Kb MP3 will be nearly as "good" as the original.
              - - - -
              BTW, I play on a violin which is well into 5 figures, and own two decent microphones.
              I would like to second these statements.



              • #87
                Originally posted by lyamc View Post
                I'm talking about real world code complexity and the target bitrates being halved. 0.5x the bitrate means 2x the code complexity IN THIS CASE
                Show me the benchmarks that prove this, then. How can you talk about real-world performance when there is not even a real-world implementation available?



                • #88
                  How does the new codec compare to H.264 et al. under the same conditions in which they were compared to H.265? What if I doubled or tripled the H.265 data rate, then compared it to H.266 and claimed the average Joe can't tell the difference when H.266 uses half the data rate? How does it scale across resolutions and bit rates? How does it handle slow pans vs. fast-action sport? Text and fine texture vs. approximately rendered trees?
                  The larger the window of data (more frames), the easier it is to optimise compression, so how much RAM, how many MIPS/FLOPS, and how many passes does it need per frame? How long will it take to get HW decoders into GPUs and into recording and playback devices like computers, notebooks, phones, cameras, TVs, capture hardware, video mixers and video-over-IP?
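
                  One crude, do-it-yourself way to probe the rate/quality scaling question is to encode a small bitrate ladder and score each rung against the source. A sketch follows, assuming ffmpeg with libx265 as a stand-in (VVC encoders are not yet widely packaged, so this does not measure H.266 itself) and a placeholder "reference.y4m" clip:

```python
# Sketch: encode a short reference clip at several bitrates and report PSNR
# against the original, as one crude way to see how quality scales with rate.
# Assumes ffmpeg with libx265 (used as a stand-in; this does NOT measure H.266)
# and a hypothetical "reference.y4m" source clip.
import subprocess

SOURCE = "reference.y4m"

for bitrate in ["1M", "2M", "4M", "8M"]:
    out = f"ladder_{bitrate}.mkv"
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx265", "-b:v", bitrate, out],
        check=True,
    )
    # The psnr filter prints a summary line on stderr; grab the line with the scores.
    result = subprocess.run(
        ["ffmpeg", "-i", out, "-i", SOURCE, "-lavfi", "psnr", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    summary = [line for line in result.stderr.splitlines() if "PSNR" in line]
    print(bitrate, summary[-1] if summary else "no PSNR line found")
```
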

                  Earlier codecs offered features and settings (profiles) to increase quality and minimise bandwidth at the cost of processing and memory resources. However, many decoders only implemented the lower profiles, so people encoding aimed for a lower bar for better compatibility.

                  Just because you can reduce the bandwidth doesn't mean you should. I would much rather see movies and sport as the cameraman and director see them than with the mashed-potato effect of high compression (too much change/movement). I'd much rather have 2 or 3 good channels than 3 additional channels of rubbish that reduce the quality of the main TV channel in the stream (lower resolution, lower fps, extreme compression). Even popular classical music is trashed by highly compressed digital TV compared to DVD/Blu-ray, and by digital radio compared to CD. I don't need 4K to see 720p; I want 1080p to look like 1080p instead of 540p. It doesn't matter how fancy the display or audio system is if what's going in looks or sounds like a waterfall over the top.



                  • #89
                    Originally posted by tygrus View Post
                    Just because you can reduce the bandwidth doesn't mean you should. I would much rather see movies and sport as the cameraman and director see them than with the mashed-potato effect of high compression (too much change/movement). I'd much rather have 2 or 3 good channels than 3 additional channels of rubbish that reduce the quality of the main TV channel in the stream (lower resolution, lower fps, extreme compression). Even popular classical music is trashed by highly compressed digital TV compared to DVD/Blu-ray, and by digital radio compared to CD. I don't need 4K to see 720p; I want 1080p to look like 1080p instead of 540p. It doesn't matter how fancy the display or audio system is if what's going in looks or sounds like a waterfall over the top.
                    In most countries, digital TV has been broadcast in 720p or 1080i H.264, and 1080p/H.265 was only recently adopted; see e.g. DVB-T2.
                    It will still take some time before 8K/H.266 becomes relevant even for playback of digital media.



                    • #90
                      Originally posted by rickst29 View Post
                      Yes, I'll describe the differences which I hear on the most "damaged" instruments. The one I complained about most was MP3, at its highest "standard" bit rate of 320kb/sec. I am a pro musician, now elderly, and I would have noticed even bigger differences 10 or 15 years ago - my ears are now somewhat damaged.
                      That long ago it was actually easy because many MP3 encoders sucked big time. I remember picking MusicMatch Jukebox as the software to rip my CDs with because it was "blazing fast" in comparison to some other apps. But it took me a couple years to realize that the speed was achieved by using a god-awful encoder. Anyone with $20 headphones would have noticed that.

                      It is basically proven that 192 kbps MP3 with the LAME encoder is enough to produce compression indistinguishable from the source. Actually, I believe the threshold is around 128 kbps for most music pieces. Of course, you can skew the results toward your own biases by picking a bad encoder.

