
Thread: Upcoming Maxwell GPUs Will Support H.265, But VP9 Is Uncertain

  1. #11
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,584


    Quote Originally Posted by plonoma View Post
    Exactly. Applications can choose to have the OpenCL encoder/decoder built in or use a video codec framework, e.g. gstreamer or ffmpeg. They could even offer the user a choice between the different built-in and framework codec encoders/decoders.
    Utilizing an OpenCL encoder/decoder is more than likely going to be far less energy-efficient than dedicated logic. As the HandBrake crew has found out, OpenCL has limits for encoding/decoding, and dedicated hardware blows right past it in both efficiency and performance.

  2. #12
    Join Date
    Aug 2012
    Posts
    411


    Quote Originally Posted by deanjo View Post
    Utilizing an OpenCL encoder/decoder is more than likely going to be far less energy-efficient than dedicated logic. As the HandBrake crew has found out, OpenCL has limits for encoding/decoding, and dedicated hardware blows right past it in both efficiency and performance.
    The problem with dedicated hardware, though, is that it has very limited support for compression technologies and formats, so you lose quality for the sake of compatibility. I still vote for proper support for Daala over H.265 or VP9. There is a possibility that it will get OpenCL support beyond simple multithreaded encoding/decoding, so even if it never gets hardware support, it will at least have good performance on its own. CPUs have shrunk to the point that the energy consumed by software decoding is almost negligible. For example, I have a laptop that consumes no more than 5 W when software-decoding a 1080p 10-bit H.264 video.

  3. #13
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,584


    Quote Originally Posted by mmstick View Post
    The problem with dedicated hardware, though, is that it has very limited support for compression technologies and formats, so you lose quality for the sake of compatibility. I still vote for proper support for Daala over H.265 or VP9. There is a possibility that it will get OpenCL support beyond simple multithreaded encoding/decoding, so even if it never gets hardware support, it will at least have good performance on its own. CPUs have shrunk to the point that the energy consumed by software decoding is almost negligible. For example, I have a laptop that consumes no more than 5 W when software-decoding a 1080p 10-bit H.264 video.
    1080p is a pretty low entry level nowadays, and 5 watts is still fairly high when dedicated hardware can do it in less than half a watt. I'm decoding and playing back 4K 60p H.264 content easily on a lowly Celeron 1610 with a GTX 750 Ti, and power consumption barely rises above idle at ~4% CPU utilization.

  4. #14
    Join Date
    Aug 2012
    Posts
    411


    Quote Originally Posted by deanjo View Post
    1080p is a pretty low entry level nowadays, and 5 watts is still fairly high when dedicated hardware can do it in less than half a watt. I'm decoding and playing back 4K 60p H.264 content easily on a lowly Celeron 1610 with a GTX 750 Ti, and power consumption barely rises above idle at ~4% CPU utilization.
    I wasn't aware that movies and TV came in resolutions beyond 1080p @ 24 FPS. 5 watts really isn't that much; even the graphics card in my desktop consumes more than that when idle. Looking at the new mobile APUs, they can probably do what my laptop does in under a watt. The space savings I get from encoding all my content at maximum compression settings with 10-bit H.264 spare me from buying multiple 3 TB hard drives, with the greater risk of drive failure that entails.

  5. #15
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,584


    Quote Originally Posted by mmstick View Post
    I wasn't aware that movies and TV came in resolutions beyond 1080p @ 24 FPS.
    JVC GY-HMQ10 footage, for one. There are also 4K streams available online from YouTube, and Netflix has just started streaming 4K on a few titles:

    http://blog.netflix.com/2014/05/netf...tra-hd-4k.html

    5 watts really isn't that much; even the graphics card in my desktop consumes more than that when idle. Looking at the new mobile APUs, they can probably do what my laptop does in under a watt. The space savings I get from encoding all my content at maximum compression settings with 10-bit H.264 spare me from buying multiple 3 TB hard drives, with the greater risk of drive failure that entails.
    5 watts is still roughly 20 times the power that dedicated hardware requires. Don't forget that commercially available content is going to be encoded to the baseline capabilities of dedicated hardware; publishers have to keep their content playable on mainstream AV equipment.
    Last edited by deanjo; 05-10-2014 at 10:40 PM.
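The power comparison in this exchange reduces to simple arithmetic. A minimal sketch, using the figures claimed in the thread (roughly 5 W for software decode versus half a watt or less for a fixed-function block); the daily playback time is an assumption added for illustration:

```python
# Figures are the thread's own claims, not measurements.
software_decode_w = 5.0   # claimed software-decode draw on the laptop
hardware_decode_w = 0.25  # claimed fixed-function draw ("less than half a watt")

ratio = software_decode_w / hardware_decode_w
print(f"software draws {ratio:.0f}x the power of dedicated hardware")

# Assumed two hours of playback per day for a year:
hours_per_year = 2 * 365
extra_kwh = (software_decode_w - hardware_decode_w) * hours_per_year / 1000
print(f"extra energy per year: {extra_kwh:.2f} kWh")
```

Even at the claimed 20x ratio, the absolute difference works out to only a few kWh per year, which is why the two posters can look at the same numbers and reach opposite conclusions.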

  6. #16
    Join Date
    Aug 2012
    Posts
    411


    Quote Originally Posted by deanjo View Post
    JVC GY-HMQ10 footage, for one. There are also 4K streams available online from YouTube, and Netflix has just started streaming 4K on a few titles:

    http://blog.netflix.com/2014/05/netf...tra-hd-4k.html

    5 watts is still roughly 20 times the power that dedicated hardware requires. Don't forget that commercially available content is going to be encoded to the baseline capabilities of dedicated hardware; publishers have to keep their content playable on mainstream AV equipment.
    The relative power difference isn't what matters; that's the point you don't seem to understand. The storage savings on my media server already shave off more than 5 watts, since I don't have to buy multiple hard drives. 5 watts is still, realistically and practically, very low for decoding HD content on a two-year-old mobile processor. Even the light bulb above my head consumes far more energy, yet you don't see me complaining about it.

    I don't particularly care what mainstream AV equipment is capable of or what the original source was transcoded into; I don't use mainstream AV equipment, and I don't play back original sources. BDs from the store are already vastly bloated in file size compared to the carefully transcoded media I have collected over the years.

  7. #17
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,584


    Quote Originally Posted by mmstick View Post
    The relative power difference isn't what matters; that's the point you don't seem to understand. The storage savings on my media server already shave off more than 5 watts, since I don't have to buy multiple hard drives. 5 watts is still, realistically and practically, very low for decoding HD content on a two-year-old mobile processor. Even the light bulb above my head consumes far more energy, yet you don't see me complaining about it.
    What you don't seem to understand is that whether the decoding is done in hardware or software, the power savings from a smaller file size requiring fewer hard drives apply to both regardless.

    I don't particularly care what mainstream AV equipment is capable of or what the original source was transcoded into; I don't use mainstream AV equipment, and I don't play back original sources. BDs from the store are already vastly bloated in file size compared to the carefully transcoded media I have collected over the years.
    You have to realize that your use case is not the norm, and the norm is what dictates whether development effort is worthwhile for an implementation. No matter how careful you are and whatever über settings you use, you will always introduce further degradation in image quality. Whether you can notice it on your equipment or not is another matter entirely.
    Last edited by deanjo; 05-10-2014 at 11:37 PM.

  8. #18
    Join Date
    Aug 2012
    Posts
    411


    Quote Originally Posted by deanjo View Post
    What you don't seem to understand is that whether the decoding is done in hardware or software, the power savings from a smaller file size requiring fewer hard drives apply to both regardless.

    You have to realize that your use case is not the norm, and the norm is what dictates whether development effort is worthwhile for an implementation. No matter how careful you are and whatever über settings you use, you will always introduce further degradation in image quality. Whether you can notice it on your equipment or not is another matter entirely.
    Incorrect; if I use standard-profile H.264 that can be hardware-decoded, the file size becomes significantly larger for similar image quality. Compressing 8-bit media as 10-bit lets the encoder save space because it handles color gradients significantly better. With the enhanced ability to combat banding, you can use a lower CRF value in combination with maximum compression settings to cut the file size down massively. My entire 10-bit H.264 archive is compressed with the placebo preset, with the CRF value low enough that image quality has not noticeably decreased. That archive currently takes up 2.1 TB. The original sources were several times larger and would have taken many 3 TB hard drives to hold.
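The workflow described above (an 8-bit source re-encoded as 10-bit H.264 with the slowest x264 preset and a quality-based CRF target) might look something like the following. This is a hypothetical sketch: the file names and CRF value are invented, the poster's exact settings are not given, and a 10-bit-capable build of x264 is assumed (around 2014, x264 had to be compiled separately for 10-bit output).

```python
# Hypothetical ffmpeg/x264 invocation for the archive encode described
# above. Input/output names and the CRF value are invented for
# illustration; "placebo" is x264's maximum-compression preset.
input_file = "source_8bit.mkv"     # hypothetical 8-bit source rip
output_file = "archive_10bit.mkv"  # hypothetical 10-bit archive copy

cmd = [
    "ffmpeg", "-i", input_file,
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p10le",  # store the 8-bit source with 10-bit precision
    "-preset", "placebo",       # slowest preset, maximum compression effort
    "-crf", "18",               # quality target; lower CRF = higher quality
    "-c:a", "copy",             # pass audio through untouched
    output_file,
]
print(" ".join(cmd))  # inspect the command before running it
```

A lower CRF spends bits to preserve detail, while the 10-bit pixel format gives the encoder finer precision in gradients, which is the anti-banding argument made above.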

    My use case is irrelevant. H.265 and VP9 are not the norm today either, so why are hardware manufacturers going to support them? Instead of choosing to support H.265, which has failed before it was even released (in the same way Opus defeated the last proprietary next-gen audio codec and is now 'mainstream'), it would be wiser to select a codec with even better technology to develop around. As it stands, the reference H.265 encoder isn't much better than 10-bit H.264 with the latest x264.

  9. #19
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,584


    Quote Originally Posted by mmstick View Post
    Incorrect; if I use standard-profile H.264 that can be hardware-decoded, the file size becomes significantly larger for similar image quality. Compressing 8-bit media as 10-bit lets the encoder save space because it handles color gradients significantly better. With the enhanced ability to combat banding, you can use a lower CRF value in combination with maximum compression settings to cut the file size down massively.
    Using 10-bit on an 8-bit source only combats further banding in gradients to some extent. It is still, however, not as good as the original 8-bit source; that is just fact. It doesn't get better, it degrades, as with any other generational re-encoding. Ripping a CD that was created from a 128k MP3 to a 256k MP3 doesn't magically make the sound better; it still degrades.

    My entire 10-bit H.264 archive is compressed with the placebo preset, with the CRF value low enough that image quality has not noticeably decreased.
    Ya, the placebo effect is definitely working on you.

    My use case is irrelevant. H.265 and VP9 are not the norm today either, so why are hardware manufacturers going to support them? Instead of choosing to support H.265, which has failed before it was even released (in the same way Opus defeated the last proprietary next-gen audio codec and is now 'mainstream'), it would be wiser to select a codec with even better technology to develop around. As it stands, the reference H.265 encoder isn't much better than 10-bit H.264 with the latest x264.
    H.265 is already present in many of this year's devices; you can go out and buy consumer goods right now that support it. My new Samsung UN65HU8550F supports it out of the box (and also supports VP8, btw). The new prosumer 4K cameras are supporting H.265 as well.

  10. #20
    Join Date
    Sep 2013
    Posts
    125


    Quote Originally Posted by deanjo View Post
    Using 10-bit on an 8-bit source only combats further banding in gradients to some extent. It is still, however, not as good as the original 8-bit source; that is just fact. It doesn't get better, it degrades, as with any other generational re-encoding.
    You missed the point. He said that by using 10-bit colour depth and other advanced features that most hardware decoders (not to mention encoders) do not support, you can get comparable visual quality from the same source material at lower bitrates. That is also fact.
