FFmpeg Lands NVDEC-Accelerated H.264 Decoding


  • #31
    Originally posted by caligula:
    On desktop, I don't want to have two GPUs just to 1) play a game plus another to 2) encode & broadcast my gameplay to the web.
    Or, for those not interested in streaming:
    - not having to buy a mid-range (or better) CPU just to watch high-res movies
    - not having a GPU power up its 3D cores and fans, making noise and wasting power just for a movie or whatever
    - laptops: both of the above combined, plus a battery that we don't want to waste on bullshit when an ASIC could do the same job with a fraction of the power/heat/resource budget

    Really, anyone who thinks hardware acceleration for media is unnecessary is totally nuts, or out of touch with the 21st century.

    • #32
      Originally posted by caligula:
      Not long ago, high-end Intel CPUs couldn't decode 1080p H.264.
      You have to go back pretty far for that to be the case. I regularly watched 24/30 fps 1080p videos on a Pentium 4 2.8 GHz. And yes, definitely H.264.

      You can't just ignore 13 years of semiconductor manufacturing progress, or are you prepared to claim a 1000x efficiency benefit for ASIC technology from 2004?

      Originally posted by caligula:
      GPUs and CPUs could decode (and encode), but why bother
      The point is versatility. It's the same reason there are CPU-based decoders: you wouldn't prefer to use one, but if you're not on battery power and there's no support for your hardware decoder, then you would.

      And while GPU software decoders wouldn't have the same straight-line speed as a hardware decoder, they also wouldn't use much of the GPU's resources, which could allow many streams to be decoded concurrently. I actually have just such a use case.
      Last edited by coder; 16 November 2017, 01:18 AM.
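
      A rough sketch of that hardware-or-software fallback using libavcodec's hwaccel API (illustrative only, not code from the post; it assumes an FFmpeg build with CUDA/NVDEC support, follows the pattern of FFmpeg's doc/examples/hw_decode.c, and omits the actual demux/decode loop):

      /*
       * Open an H.264 decoder with the CUDA/NVDEC hwaccel when a CUDA device
       * can be created, otherwise fall back to plain CPU decoding.
       * Build (assuming FFmpeg development packages are installed):
       *   gcc nvdec_fallback.c $(pkg-config --cflags --libs libavcodec libavutil)
       */
      #include <stdio.h>
      #include <libavcodec/avcodec.h>
      #include <libavutil/hwcontext.h>

      /* Pick the CUDA pixel format if the decoder offers it, otherwise the
       * first (software) format in the list. */
      static enum AVPixelFormat pick_format(AVCodecContext *ctx,
                                            const enum AVPixelFormat *fmts)
      {
          (void)ctx;  /* unused here */
          for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
              if (*p == AV_PIX_FMT_CUDA)
                  return *p;
          return fmts[0];
      }

      int main(void)
      {
          const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
          AVCodecContext *ctx  = codec ? avcodec_alloc_context3(codec) : NULL;
          if (!ctx) {
              fprintf(stderr, "no H.264 decoder in this libavcodec build\n");
              return 1;
          }

          /* Try to create a CUDA device; failure just means we stay on the CPU. */
          AVBufferRef *hw_dev = NULL;
          if (av_hwdevice_ctx_create(&hw_dev, AV_HWDEVICE_TYPE_CUDA,
                                     NULL, NULL, 0) == 0) {
              ctx->hw_device_ctx = av_buffer_ref(hw_dev);
              ctx->get_format    = pick_format;
              printf("using NVDEC/CUDA hardware decoding\n");
          } else {
              printf("no usable NVIDIA device, falling back to software decoding\n");
          }

          if (avcodec_open2(ctx, codec, NULL) < 0) {
              fprintf(stderr, "failed to open decoder\n");
              return 1;
          }

          /* ... avcodec_send_packet() / avcodec_receive_frame() loop here ... */

          avcodec_free_context(&ctx);
          av_buffer_unref(&hw_dev);
          return 0;
      }

      If av_hwdevice_ctx_create() fails (no NVIDIA GPU or no driver), the decoder simply opens in software mode, which is exactly the versatility being argued for above.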
