New Intel Mesa Driver Patches Implement AV1 Decode For Vulkan Video

  • Quackdoc
    Senior Member
    • Oct 2020
    • 5053

    #11
    Originally posted by polarathene

    Igalia seems to have plans for gstreamer and implementing mobile support based on these slides?

    So perhaps there's some way that's wired up with mobile devices that offer vulkan drivers? They also talk about a zink experiment to leverage vulkan video.
    gstreamer support is underway, but that's separate from mobile support. They don't really say how they implement this, but if I were to take a guess (and mind you, this is with very little context), I presume it would be best to implement this as a vulkan layer. Keep in mind that the GPU and the hardware video accelerator can be changed independently of each other, which means you can't just make your vulkan driver "support whatever asic you have". Rather, it might make sense if they use layers or something like that to bridge to the asics, or more likely, bridge to v4l2-m2m.
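    To make the "does this driver actually cover the asic" question concrete, here is a minimal sketch, assuming recent Vulkan headers, of how an app (or a gstreamer element) could probe whether a physical device advertises the AV1 decode extensions at all. The supports_av1_decode and has_extension names are just illustrative helpers; the extension name strings are the Khronos ones.

    #include <stdbool.h>
    #include <stdlib.h>
    #include <string.h>
    #include <vulkan/vulkan.h>

    /* Illustrative helper: search an extension list for a given name. */
    static bool has_extension(const VkExtensionProperties *props,
                              uint32_t count, const char *name)
    {
        for (uint32_t i = 0; i < count; i++)
            if (strcmp(props[i].extensionName, name) == 0)
                return true;
        return false;
    }

    /* Illustrative helper: true if the device exposes the extensions
     * needed for Vulkan Video AV1 decode (error handling trimmed). */
    bool supports_av1_decode(VkPhysicalDevice phys)
    {
        uint32_t count = 0;
        vkEnumerateDeviceExtensionProperties(phys, NULL, &count, NULL);

        VkExtensionProperties *props = calloc(count, sizeof(*props));
        vkEnumerateDeviceExtensionProperties(phys, NULL, &count, props);

        bool ok = has_extension(props, count, "VK_KHR_video_queue") &&
                  has_extension(props, count, "VK_KHR_video_decode_queue") &&
                  has_extension(props, count, "VK_KHR_video_decode_av1");

        free(props);
        return ok;
    }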


    • marlock
      Senior Member
      • Nov 2018
      • 435

      #12
      i imagine gstreamer will probably query for available vulkan video extensions (which in turn should only be available if there's a compliant driver and hardware in the machine) and fall back to its other code paths when that's not available (a rough sketch of such a query is at the end of this post)

      this would likely allow gstreamer devs to focus on the codepath selection logic, the translation/generalization of the various codec APIs and their specifics, software fallback decoding logic... while optimal code for specific hardware becomes driver-level work behind a common cross-vendor API with a proper definition, compliance testing, etc

      afaik gpus that don't have an asic for video encoding/decoding might still implement it in their driver over the available shaders and whatnot and expose this under the vulkan video API, thus making this a natural "best available option" API

      and i bet lavapipe will also end up implementing some vulkan video over CPU codepaths that may leverage AVX and such as available and applicable

      which means at some point targeting this API might be all you really need in gstreamer and the likes of it, game engines, or even directly in apps

      imho probably the biggest benefit of this being done as a vulkan API extension (beyond having proper API governance, documentation, visibility, etc) is that encoding/decoding codepaths become driver development, so this gets done by devs who understand the hardware and are better positioned to optimize for it, while app logic stays in userland with app devs
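      Following up on the first paragraph above, a minimal sketch, assuming recent Vulkan headers with the video extensions, of that query-then-fall-back check: walk the queue families and see whether any of them advertises AV1 decode. The device_has_av1_decode_queue name is just illustrative.

      #include <stdbool.h>
      #include <stdlib.h>
      #include <vulkan/vulkan.h>

      /* Illustrative helper: true if any queue family offers video decode
       * with the AV1 codec operation, i.e. the Vulkan path is usable and
       * the caller does not need to fall back to another code path. */
      bool device_has_av1_decode_queue(VkPhysicalDevice phys)
      {
          uint32_t count = 0;
          vkGetPhysicalDeviceQueueFamilyProperties2(phys, &count, NULL);

          VkQueueFamilyProperties2 *props = calloc(count, sizeof(*props));
          VkQueueFamilyVideoPropertiesKHR *video = calloc(count, sizeof(*video));
          for (uint32_t i = 0; i < count; i++) {
              video[i].sType = VK_STRUCTURE_TYPE_QUEUE_FAMILY_VIDEO_PROPERTIES_KHR;
              props[i].sType = VK_STRUCTURE_TYPE_QUEUE_FAMILY_PROPERTIES_2;
              props[i].pNext = &video[i];
          }
          vkGetPhysicalDeviceQueueFamilyProperties2(phys, &count, props);

          bool found = false;
          for (uint32_t i = 0; i < count; i++) {
              if ((props[i].queueFamilyProperties.queueFlags &
                   VK_QUEUE_VIDEO_DECODE_BIT_KHR) &&
                  (video[i].videoCodecOperations &
                   VK_VIDEO_CODEC_OPERATION_DECODE_AV1_BIT_KHR))
                  found = true;
          }

          free(video);
          free(props);
          return found;
      }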
      Last edited by marlock; 29 December 2024, 11:28 PM.


      • Quackdoc
        Senior Member
        • Oct 2020
        • 5053

        #13
        Originally posted by marlock
        This is most likely how the general decodebin/encodebin will work, but with gstreamer, if you want vulkan decode/encode explicitly, you just use something like filesrc ! vkdecodebin ! videoconvert ! whatever here ! videoconvert ! vkh264enc ! mux stuff....
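        For illustration, a minimal C sketch of driving such a pipeline from an application with gst_parse_launch(). The vkdecodebin / vkh264enc element names and the file locations are placeholders taken from the pipeline sketch above, not confirmed GStreamer elements.

        #include <gst/gst.h>

        int main(int argc, char **argv)
        {
            gst_init(&argc, &argv);

            /* Element names and file locations are placeholders from the
             * pipeline sketch above; substitute whatever the installed
             * plugins actually provide. */
            GError *err = NULL;
            GstElement *pipeline = gst_parse_launch(
                "filesrc location=in.mkv ! vkdecodebin ! videoconvert ! "
                "vkh264enc ! matroskamux ! filesink location=out.mkv", &err);
            if (!pipeline) {
                g_printerr("failed to build pipeline: %s\n", err->message);
                g_clear_error(&err);
                return 1;
            }

            gst_element_set_state(pipeline, GST_STATE_PLAYING);

            /* Wait for end-of-stream or an error on the pipeline bus. */
            GstBus *bus = gst_element_get_bus(pipeline);
            GstMessage *msg = gst_bus_timed_pop_filtered(
                bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
            if (msg)
                gst_message_unref(msg);

            gst_object_unref(bus);
            gst_element_set_state(pipeline, GST_STATE_NULL);
            gst_object_unref(pipeline);
            return 0;
        }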

