
Upcoming Maxwell GPUs Will Support H.265, But VP9 Is Uncertain


  • #16
    Originally posted by deanjo View Post
    JVC GY-HMQ10 footage. There are also 4K streams available online from YouTube. (Netflix also just started streaming 4K on a few titles.)

    http://blog.netflix.com/2014/05/netf...tra-hd-4k.html



    5 watts is still roughly 20 times the power that dedicated hardware requires. Don't forget that commercially available content is going to be encoded to the baseline capabilities of dedicated hardware. They have to keep their content playable on mainstream AV equipment.
    The relative power difference isn't what matters, which is the point you don't seem to understand. The savings in hard drive storage for my media server already shave off more than 5 watts, and I don't have to buy multiple hard drives. 5 Watts is still, realistically and practically, very low for decoding HD content on a two year old mobile processor. Even the light bulb above my head consumes far more energy, yet you don't see me complaining about it.

    I don't particularly care what mainstream AV equipment is capable of or what the original source was transcoded into -- I don't use mainstream AV equipment and I don't play back original sources. BDs from the store are already vastly bloated in file size compared to the carefully transcoded media I have collected over the years.

    Comment


    • #17
      Originally posted by mmstick View Post
      The relative power difference isn't what matters, which is the point you don't seem to understand. The savings in hard drive storage for my media server already shave off more than 5 watts, and I don't have to buy multiple hard drives. 5 Watts is still, realistically and practically, very low for decoding HD content on a two year old mobile processor. Even the light bulb above my head consumes far more energy, yet you don't see me complaining about it.
      What you don't seem to understand is that whether the encoding/decoding is done via hardware or software, the power savings from a smaller file size that allows fewer hard drives to be used apply to both regardless.

      I don't particularly care what mainstream AV equipment is capable of or what the original source was transcoded into -- I don't use mainstream AV equipment and I don't play back original sources. BDs from the store are already vastly bloated in file size compared to the carefully transcoded media I have collected over the years.
      You have to realize that your use case is not the norm. The norm is what dictates whether it is worthwhile to put development effort into an implementation. No matter how careful you are and what uber settings you use, you will always have further degradation in IQ. Whether you can notice it on your equipment or not is another matter entirely.
      Last edited by deanjo; 05-10-2014, 11:37 PM.

      Comment


      • #18
        Originally posted by deanjo View Post
        What you don't seem to understand is that whether the encoding/decoding is done via hardware or software, the power savings from a smaller file size that allows fewer hard drives to be used apply to both regardless.



        You have to realize that your use case is not the norm. The norm is what dictates whether it is worthwhile to put development effort into an implementation. No matter how careful you are and what uber settings you use, you will always have further degradation in IQ. Whether you can notice it on your equipment or not is another matter entirely.
        Incorrect; if I use standard-profile H.264 that can be hardware-decoded, the file size becomes significantly larger to reach similar image quality. Compressing 8-bit media into 10-bit lets the encoder save extra space, as it handles color gradients significantly better. With the enhanced capability to combat banding, you can use a lower CRF value in combination with maximum compression settings to cut the file size down massively. My entire 10-bit H.264 archive is compressed with the placebo preset, with the CRF value low enough that image quality has not decreased in any noticeable way. That archive currently takes up 2.1 TB. The original sources were several times larger, and could have taken several 3 TB hard drives to hold.
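        A sketch of the kind of x264 invocation being described (the file names and the exact CRF value are placeholders, and 10-bit output assumes x264 was built with --bit-depth=10):

```shell
# Sketch: 10-bit H.264 at maximum compression settings, as described above.
# Assumes a 10-bit build of x264; input.y4m and the CRF value are placeholders.
x264 --preset placebo \
     --profile high10 \
     --crf 16 \
     --output episode.264 \
     input.y4m
```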

        My use case is irrelevant. H.265 and VP9 are not the norm today, so why are hardware manufacturers going to support them? Instead of choosing to support H.265 -- which has failed before it even released, just as the last proprietary next-gen audio codec was defeated by Opus, which is now 'mainstream' -- it would be wiser to select a codec with even better technologies to develop around. As it stands, the reference H.265 codec isn't much better than H.264 10-bit with the latest x264.

        Comment


        • #19
          Originally posted by mmstick View Post
          Incorrect; if I use standard-profile H.264 that can be hardware-decoded, the file size becomes significantly larger to reach similar image quality. Compressing 8-bit media into 10-bit lets the encoder save extra space, as it handles color gradients significantly better. With the enhanced capability to combat banding, you can use a lower CRF value in combination with maximum compression settings to cut the file size down massively.
          Using 10-bit on an 8-bit source only combats further banding in gradients to some extent. It is still, however, not as good as the original 8-bit source. That is just fact. It doesn't get better; it degrades, as is the case with any other generational re-encoding. Ripping a CD that was created from 128k MP3 to 256k MP3 doesn't magically make the sound better; it still degrades.

          My entire 10-bit H.264 archive is compressed with the placebo preset, with the CRF value low enough that image quality has not decreased in any noticeable way.
          Ya, the placebo effect is definitely working on you.

          My use case is irrelevant. H.265 and VP9 are not the norm today, so why are hardware manufacturers going to support them? Instead of choosing to support H.265 -- which has failed before it even released, just as the last proprietary next-gen audio codec was defeated by Opus, which is now 'mainstream' -- it would be wiser to select a codec with even better technologies to develop around. As it stands, the reference H.265 codec isn't much better than H.264 10-bit with the latest x264.
          H.265 is already present in many of this year's devices. You can go out and buy consumer goods right now that support it. My new Samsung UN65HU8550F supports it out of the box (it also supports VP8, btw). The new prosumer 4K cameras are also supporting H.265.

          Comment


          • #20
            Originally posted by deanjo View Post
            Using 10-bit on an 8-bit source only combats further banding in gradients to some extent. It is still, however, not as good as the original 8-bit source. That is just fact. It doesn't get better; it degrades, as is the case with any other generational re-encoding.
            You missed the point. He said that by using 10-bit colour depth and other advanced features most hardware decoders (not to mention encoders) do not support, you can get comparable visual quality from the same source material at lower bitrates. That is also fact.
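            As a toy illustration of why the extra precision helps (this is just rounding error on a smooth ramp, not a real codec; the bit depths are the only inputs):

```python
# Quantize a smooth gradient through an 8-bit vs a 10-bit intermediate
# and compare the worst-case rounding error (a stand-in for banding).
def roundtrip(values, bits):
    scale = (1 << bits) - 1
    # quantize each value to `bits` precision, then dequantize
    return [round(v * scale) / scale for v in values]

gradient = [i / 5100 for i in range(5101)]  # ramp finer than 8-bit steps
err8 = max(abs(a - b) for a, b in zip(gradient, roundtrip(gradient, 8)))
err10 = max(abs(a - b) for a, b in zip(gradient, roundtrip(gradient, 10)))
print(err8 > err10)  # True: the 10-bit intermediate loses less precision
```

            The worst-case error through the 10-bit path is about a quarter of the 8-bit one, which is the headroom an encoder can spend on smoother gradients at the same bitrate.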

            Comment


            • #21
              I feel like the hardware has moved faster than the software for some time now. Why bother adding H.265 hardware when in two years' time it will probably play back on a phone for less than 1 W? I feel like Nvidia is just buying time.

              Comment


              • #22
                Originally posted by tuubi View Post
                You missed the point. He said that by using 10-bit colour depth and other advanced features most hardware decoders (not to mention encoders) do not support, you can get comparable visual quality from the same source material at lower bitrates. That is also fact.
                All H.265 encoders/decoders support 10-bit (and beyond); it's part of the spec. Everything is "comparable", btw; you still have degradation, and that is fact.

                Comment


                • #23
                  Originally posted by deanjo View Post
                  All H.265 encoders/decoders support 10-bit (and beyond); it's part of the spec. Everything is "comparable", btw; you still have degradation, and that is fact.
                  You have a serious case of selective reading.

                  Name one H.264 hardware decoder that supports 10-bit H.264 decoding: none.
                  How long has the 10-bit profile of H.264 been available? Years.
                  How long have H.265 hardware decoders been available, along with H.265 encoders? Exactly.

                  H.265 is still essentially not very different from H.264 from a technical standpoint, and offers little benefit over the existing H.264 10-bit profile. VP9 isn't too interesting either. Only Daala is interesting, as it implements newer, better technologies -- it's an entire generation ahead of H.265.

                  Having encoded over 14,000 episodes of media, I can't share your view when you treat degradation as if it's a problem. I've done many blind tests over the last three years, and no one can tell the difference between the original source and my encodes. If whatever degradation occurs is not noticeable, then why does it even matter? Standard Blu-ray media carries incredible bitrate bloat, which can be mitigated by re-encoding into a better format. Encoding technologies have come a long way, and with 10-bit, errors made by the encoder/decoder are mitigated to the point where quality looks the same after re-encoding if your CRF value is low enough.
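                  A rough back-of-the-envelope version of the storage argument (the 2.1 TB archive and 3 TB drive sizes come from the thread; the 4x source-to-encode ratio is an assumption standing in for "several times larger"):

```python
import math

archive_tb = 2.1  # size of the re-encoded 10-bit H.264 archive (from the thread)
ratio = 4         # assumption: sources "several times larger" than the encodes
drive_tb = 3      # 3 TB hard drives, as mentioned in the thread

source_tb = archive_tb * ratio  # 8.4 TB of original sources

def drives_needed(tb):
    """Whole drives required to hold `tb` terabytes."""
    return math.ceil(tb / drive_tb)

print(drives_needed(source_tb), drives_needed(archive_tb))  # 3 1
```

                  Under these assumptions the sources need three drives where the re-encoded archive fits on one.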

                  Want to know what's really degraded? Netflix streams. 4K Netflix is useless since Netflix bitrate-starves their content, which makes the high resolution practically pointless. As it stands today, I still see content getting released on Blu-ray that's actually just a 720p upscale to 1080p. I'd rather see high-quality content compressed at the source with the best methods (Daala + Opus) than the low-quality stuff we have today -- at least that would remove the need to re-encode content for the sake of getting rid of the bloat.
                  Last edited by mmstick; 05-11-2014, 06:53 PM.

                  Comment


                  • #24
                    Originally posted by zanny View Post
                    If you live in the US and are encoding video, say to put on YouTube, you are violating MPEG-LA licensing terms by using x264, for example. And how many commercial H.264 encoders are people using? The entire industry seems to be on x264 anyway, and it is basically a time bomb waiting to blow whenever MPEG-LA goes all Oracle on everyone, or at least on x264 distributors.

                    We need an open codec that isn't a legal black hole. I'm still hoping Daala can be accelerated with OpenCL enough to do a good job, because VP9 is more of a stopgap based on old tech and it isn't really competitive with either H.264 or H.265 at all.
                    You're never going to get an "open codec that isn't a legal black hole". You're assuming the problem is somehow technical, whereas the problem is the brokenness of the current patent system. To do ANYTHING complicated means you're going to violate some nonsense that you had no idea anyone would be stupid enough to patent.
                    What protects x264 is simply the fact that there's no money to go after. Daala will change nothing. As long as it, too, is not attached to anyone with money, no-one will go after it. As soon as it gets adopted by anyone with money, you'll discover that it's infringing fifteen ridiculous patents that should never have been granted.

                    You fix the problem by fixing the patent system, not by imagining that by doing things "right" this one time you'll magically create something that won't attract the trolls.
                    Last edited by name99; 05-11-2014, 10:43 PM.

                    Comment


                    • #25
                      Originally posted by name99 View Post
                      You're never going to get an "open codec that isn't a legal black hole". You're assuming the problem is somehow technical, whereas the problem is the brokenness of the current patent system. To do ANYTHING complicated means you're going to violate some nonsense that you had no idea anyone would be stupid enough to patent.
                      What protects x264 is simply the fact that there's no money to go after. Daala will change nothing. As long as it, too, is not attached to anyone with money, no-one will go after it. As soon as it gets adopted by anyone with money, you'll discover that it's infringing fifteen ridiculous patents that should never have been granted.

                      You fix the problem by fixing the patent system, not by imagining that by doing things "right" this one time you'll magically create something that won't attract the trolls.
                      The same people behind x264 were developing a next-gen M4A/AAC, but it was shot down by the superior Opus codec, developed by the same people making Daala. Now, the thing is, Opus uses technology from, and is being used by, many big players in the industry. In fact, the international mobile network relies on Opus for voice calls. Even Steam and Skype use a technology that is part of Opus (SILK). No one has gone after Opus, though.

                      Comment


                      • #26
                        Originally posted by mmstick View Post
                        Name one H.264 hardware decoder that supports 10-bit H.264 decoding: none.
                        How long has the 10-bit profile of H.264 been available? Years.
                        How long have H.265 hardware decoders been available, along with H.265 encoders? Exactly.

                        H.265 is still essentially not very different from H.264 from a technical standpoint, and offers little benefit over the existing H.264 10-bit profile. VP9 isn't too interesting either. Only Daala is interesting, as it implements newer, better technologies -- it's an entire generation ahead of H.265.
                        So the main difference between 10-bit H.264 and H.265 is that hardware decoders for H.265 actually exist?
                        Seems like a good selling point to me.

                        Comment
