60 fps h264/HEVC playback with mpv

  • 60 fps h264/HEVC playback with mpv

    I should be receiving my supposedly HDMI 2.1, 120 Hz capable LG TV on Wednesday, so I'm wondering if any of my existing hardware will be able to handle 60 fps playback of 4K h264/h265 files under mpv when using an HDMI 2.1 cable.

    My laptop has an i7-7700HQ with a Quadro M620. Might this be capable of 4K @ 60 fps h26x decoding and playback? I've also got an 8GB RX 580, but I don't currently have a decent machine for it to go in, so I'm thinking of getting an AM4 socket machine to give me a team red box.

    If the Quadro M620 won't cut it, might I be able to get away with using the iGPU/APU (or whatever it's called) on the Ryzen 5 3400G for this? I want to minimise or eliminate fans, so I'd prefer to use that or a passively cooled GPU rather than the RX 580 or anything with fans. I would also prefer AMD GPUs over NV ones.
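
    A minimal sketch of the kind of check involved, assuming a reasonably recent mpv (the file path here is just a placeholder):

        # Ask mpv to pick a hardware decoder and watch the terminal output;
        # it should report whether a hwdec backend (vaapi, nvdec, ...) was used.
        mpv --hwdec=auto --vo=gpu /path/to/4k60-sample.mkv

        # While the clip plays, press Shift+i to toggle the stats overlay,
        # which shows the active decoder and any dropped frames.
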
    Last edited by danboid; 10 January 2021, 06:46 AM.

  • #2
    To answer my own question, it seems the RX 6000 series are the only AMD GPUs that support HDMI 2.1. Does anyone know if 4K 120 Hz is supported yet under the Linux Radeon driver (on the RX 6800), or is it currently Windows only?
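
    One way to see from the Linux side whether the driver actually exposes a 4K 120 Hz mode (a sketch only; the output name HDMI-A-0 is just an example and will differ per setup):

        # List connected outputs and the modes/refresh rates the driver offers
        xrandr | grep -A 20 " connected"

        # If a 3840x2160 mode with a 120 Hz refresh shows up, try switching to it
        xrandr --output HDMI-A-0 --mode 3840x2160 --rate 120
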
    Last edited by danboid; 10 January 2021, 04:33 PM.

    • #3
      I'll be able to get an RTX 3060 for less than half the price of an RX 6000 card come March.

      Has anyone tried playing any 60 fps vids (such as these: https://4kmedia.org) under mpv on Linux with an RTX 3060 Ti?
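
      Not an answer for the 3060 Ti specifically, but a sketch of how one would confirm NVDEC is doing the work under mpv on an Nvidia card (assuming the proprietary driver; the file path is a placeholder):

          # Use the NVDEC hwdec backend; the terminal output should say
          # something like "Using hardware decoding (nvdec)" if it works.
          mpv --hwdec=nvdec /path/to/4k60-sample.mkv

          # In another terminal, watch the video decode engine utilisation;
          # the "dec" column should be non-zero while the clip plays.
          nvidia-smi dmon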

      • #4
        I have been playing h264/h265 4K @ 60 fps content for years.

        An i7-7700HQ should be able to decode that. h265 only barely, but I still think it might be enough.

        As for hardware decode, that depends strongly on whether you want HDR or just 8-bit content. A lot of older GPU hardware doesn't support HDR decode, but the CPU can usually still do it. On my TR 2950X, mpv software-decoding HDR 4K h265 @ 60 fps (about 50 Mbps average bitrate) hovers around 380% CPU, but sometimes goes up to 410%.
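
        As a rough sketch of how to check whether a given CPU alone keeps up, mpv can be run as a pure software-decode benchmark (the file path is just a placeholder):

            # Force software decode, throw away the output and run as fast as possible;
            # if this finishes in less than the clip's real duration, the CPU keeps up.
            time mpv --no-config --hwdec=no --vo=null --untimed --no-audio /path/to/4k60-hdr-sample.mkv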

        For non-HDR content it often uses less, around 250%, and when using my GPU (AMD R9 Fury, Fiji, GFX8, which is about 5 years old) it is sometimes below 100%.

        On a Linux desktop you can easily do HEVC HDR with hardware decode using any GPU from the last 3 years.
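
        A quick sketch of checking that on the AMD/Intel side (vainfo comes from libva-utils; the file path is a placeholder):

            # List the VA-API decode profiles the driver exposes;
            # 4K HDR needs HEVC Main 10 (VAProfileHEVCMain10) to be listed.
            vainfo | grep -i hevc

            # Then point mpv at VA-API and confirm it reports hardware decoding.
            mpv --hwdec=vaapi /path/to/4k60-hdr-sample.mkv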

        As for Quadro / Nvidia on Linux, I don't have any experience with them over the last 15 years.
