VDPAU API H.265 / HEVC Decoding Lands In Mainline


  • #21
    Originally posted by Kano View Post
    Btw., H.265 is only hard to decode at the high bitrates used for 4K resolution. If your CPU is not an Atom or similar, then SW decoding of FHD is certainly possible. And if you really have 4K movies, you usually want to watch them on a 4K TV. For that, HDMI 2.0 is a requirement - only expensive ones have DP 1.3 as well. Right now only the GTX 960/970/980 can be used for this - later this year, Intel Skylake. I would like to know when AMD wants to wake up - basically they should upgrade the PS4/Xbox One hardware as well, or they will really fall behind.
    It's not up to AMD to update those platforms. I'm sure they would love to put the upcoming Carrizo, with H.265 hardware encode/decode built in, into those consoles, but that will be up to Sony and Microsoft.

    • #22
      Originally posted by Kivada View Post
      They are working on this for the OSS Radeon drivers, so that people with older hardware will be able to use the 3D engine to handle unsupported codecs: http://www.x.org/wiki/RadeonFeature/
      UVD has been here for a very long time already. I'm sceptical that the unsupported GPUs pack enough juice for useful shader-driven decoding.

      • #23
        Originally posted by Kano View Post
        @ParticleBoard
        If you paid $350 for a GTX 760 then you must be very stupid.
        $350, taxes in, to my door. Also, that's CAD, not USD. Yes, $350 USD would be stupid.

        • #24
          Where was your GTX 960 marketing bug btw? I would return the old Kepler card.

          • #25
            Originally posted by agd5f View Post
            The problem is certain parts of the decode pipeline are not well suited to parallel shader type operations. As such you can only accelerate certain parts of the pipeline effectively on shaders.
            Should an APU be able to handle this or not?

            Semi-OT: Does anyone know why AMD stopped working on libjpeg-turbo? At the beginning they started on it as part of their OpenCL work (with their CPUs in mind), but after some months AMD stopped working on it, so it never got upstreamed.
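
            For illustration only - a tiny, hypothetical C sketch (names like toy_dec and decode_bin are invented; this is not real HEVC CABAC code) of why the entropy-decoding stage agd5f is talking about resists shader-style parallelism: every decoded bin rewrites the range/offset state that the very next bin depends on, so the bins have to come out one after another no matter how many shader threads are available.

            Code:
            #include <stdint.h>
            #include <stddef.h>

            /* Toy binary range decoder - NOT real HEVC CABAC, just an
             * illustration of the loop-carried dependency in entropy decoding. */
            typedef struct {
                uint32_t range, offset;   /* arithmetic-coder state  */
                const uint8_t *buf;       /* coded bitstream         */
                size_t bitpos;            /* next bit to consume     */
            } toy_dec;

            int decode_bin(toy_dec *d, uint32_t p_lps /* LPS probability, 0..255 */)
            {
                uint32_t lps = (d->range * p_lps) >> 8;   /* split the current range     */
                int bin;

                if (d->offset >= d->range - lps) {        /* offset fell in the LPS part */
                    d->offset -= d->range - lps;
                    d->range   = lps;
                    bin = 1;
                } else {                                  /* MPS part                    */
                    d->range  -= lps;
                    bin = 0;
                }
                while (d->range < 0x100) {                /* renormalize: pull in bits   */
                    uint32_t bit = (d->buf[d->bitpos >> 3] >> (7 - (d->bitpos & 7))) & 1;
                    d->range  <<= 1;
                    d->offset   = (d->offset << 1) | bit;
                    d->bitpos++;
                }
                return bin;   /* the next call needs the state this call just updated */
            }

            Inverse transforms and motion compensation, by contrast, are per-block operations and map to shaders far more naturally - presumably the "certain parts of the pipeline" that can be accelerated.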

            • #26
              Originally posted by nanonyme View Post
              UVD has been here for a very long time already. I'm sceptical that the unsupported GPUs pack enough juice for useful shader-driven decoding.
              It's nothing compared to 3D graphics. If it can run Plants vs Zombies, it can decode simple video files...

              • #27
                Originally posted by eydee View Post
                It's nothing compared to 3D graphics. If it can run Plants vs Zombies, it can decode simple video files...
                Such fallacy.

                • #28
                  Originally posted by ParticleBoard View Post
                  ... I'm not buying the 900 series because of this, and if nobody else cares then please keep nvidia's marketing team gainfully employed and rolling in the cocaine/whores/whatever else marketers do.
                  Please tell me what difference it makes to nVIDIA's marketing team whether you buy a 760 or a 960 (NOT the affected 970).

                  • #29
                    Originally posted by curaga View Post
                    Such fallacy.
                    Why? Does anyone hold a gun to the programmer's head? Is he forbidden to write whatever code he wants? Did you know that in theory you are even allowed to use only a single unit of the GPU and run a single thread? "Doesn't work" and "not efficient" are two different things. (You can run Windows 7 on a Pentium II computer, you know. It's not efficient, but possible. And it's faster than running it on a Pentium 1...)

                    You don't have to protect anyone; I did not even attack anyone. We are talking about technical possibilities. When it comes to decoding, CPU + a very small improvement is still more than CPU + nothing. This is simple math. If the GPU decodes only one frame of the whole movie, it is still a step forward. But that is of course not the case; a much better result can be achieved, otherwise it wouldn't be in active development. People don't waste time coding stuff for small percentages.

                    • #30
                      Originally posted by Kivada View Post
                      They are working on this for the OSS Radeon drivers, so that people with older hardware will be able to use the 3D engine to handle unsupported codecs: http://www.x.org/wiki/RadeonFeature/
                      Do you have something more concrete than that chart, which has said WIP for years? Because the thing is, such a decoder wouldn't be radeon-specific, and there's been no indication anywhere that any shader-based decoders are in development. So if you have something, please share.

                      Originally posted by eydee View Post
                      otherwise it wouldn't be in active development.
                      Unless Kivada has something more than that chart, it isn't. The only HEVC work I'm aware of is in libva-intel-driver, but that's for the gen9 (Skylake) dedicated decoder, not anything shader-based.
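
                      Since the article itself is about the HEVC profiles landing in the VDPAU API, here is a minimal, hypothetical C sketch of how an application could probe for them (assumes an X11 session and a VDPAU driver; error handling trimmed): it asks the driver, via VdpDecoderQueryCapabilities, whether the new VDP_DECODER_PROFILE_HEVC_MAIN profile is advertised. Actually decoding anything would still go through VdpDecoderCreate and VdpDecoderRender.

                      Code:
                      #include <stdio.h>
                      #include <stdint.h>
                      #include <X11/Xlib.h>
                      #include <vdpau/vdpau.h>
                      #include <vdpau/vdpau_x11.h>

                      int main(void)
                      {
                          /* Bind a VDPAU device to the current X11 display. */
                          Display *dpy = XOpenDisplay(NULL);
                          if (!dpy)
                              return 1;

                          VdpDevice dev;
                          VdpGetProcAddress *gpa;
                          if (vdp_device_create_x11(dpy, DefaultScreen(dpy),
                                                    &dev, &gpa) != VDP_STATUS_OK)
                              return 1;

                          /* Fetch the capability-query entry point from the driver. */
                          VdpDecoderQueryCapabilities *query = NULL;
                          gpa(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES,
                              (void **)&query);

                          VdpBool ok = VDP_FALSE;
                          uint32_t max_level, max_mbs, max_w, max_h;
                          if (query &&
                              query(dev, VDP_DECODER_PROFILE_HEVC_MAIN, &ok,
                                    &max_level, &max_mbs, &max_w,
                                    &max_h) == VDP_STATUS_OK && ok)
                              printf("HEVC Main: up to %ux%u, level %u\n",
                                     max_w, max_h, max_level);
                          else
                              printf("HEVC Main decoding not advertised\n");

                          return 0;
                      }

                      Something like "gcc hevc_query.c -lvdpau -lX11" should build it; a vdpau.h that predates the HEVC constants won't compile it, and a driver without the support will simply report the profile as not advertised.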
