New Intel Mesa Driver Patches Implement AV1 Decode For Vulkan Video


  • Quackdoc
    replied
    Originally posted by marlock View Post
    I imagine GStreamer will probably query for the available Vulkan Video extensions (which in turn should only be available if there's a compliant driver and hardware in the machine) and fall back to its other code paths when they're not available.

    This would likely allow GStreamer devs to focus on the codepath selection logic, the codec API translation/generalization and specifics, the software fallback decoding logic... while optimal code for specific hardware becomes driver-level work behind a common cross-vendor API with proper definition, compliance testing, etc.

    AFAIK GPUs that don't have an ASIC for video encoding/decoding might still implement it in their driver over the available shaders and whatnot and expose this under the Vulkan Video API, thus making this a natural "best available option" API.

    And I bet Lavapipe will also end up implementing some Vulkan Video over CPU codepaths that may leverage AVX and such, as available and applicable.

    Which means that at some point targeting this API might be all you really need in GStreamer and the likes of it, game engines, or even directly in apps.

    IMHO, probably the biggest benefit of this being done as a Vulkan API extension (beyond having proper API governance, documentation, visibility, etc.) is that the encoding/decoding codepaths become driver development, so they get done by devs who understand the hardware and are better positioned to optimize for it, while app logic stays in userland with app devs.
    This is most likely how the general decodebin/encodebin will work, but with GStreamer, if you want Vulkan decode/encode you just use something like filesrc ! vkdecodebin ! videoconvert ! whatever here ! videoconvert ! vkh264enc ! mux stuff...
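
    For the curious, here is a rough, untested C sketch of launching a Vulkan-decode pipeline with gst_parse_launch(); the vulkanh264dec/vulkandownload element names come from GStreamer's vulkan plugin (1.24+), while the rest of the pipeline wiring is only illustrative (the vkdecodebin/vkh264enc names above are the poster's shorthand), so check gst-inspect-1.0 on your setup:

    /* Minimal sketch: run a Vulkan Video decode pipeline from C.
     * Assumed build: gcc demo.c $(pkg-config --cflags --libs gstreamer-1.0) */
    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        GError *err = NULL;
        GstElement *pipeline = gst_parse_launch(
            "filesrc location=sample.mp4 ! qtdemux ! h264parse "
            "! vulkanh264dec ! vulkandownload ! videoconvert ! autovideosink",
            &err);
        if (pipeline == NULL) {
            g_printerr("pipeline failed: %s\n", err->message);
            g_clear_error(&err);
            return 1;
        }

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        /* Block until EOS or an error is posted on the bus. */
        GstBus *bus = gst_element_get_bus(pipeline);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
        if (msg != NULL)
            gst_message_unref(msg);

        gst_object_unref(bus);
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }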


  • marlock
    replied
    I imagine GStreamer will probably query for the available Vulkan Video extensions (which in turn should only be available if there's a compliant driver and hardware in the machine) and fall back to its other code paths when they're not available.

    This would likely allow GStreamer devs to focus on the codepath selection logic, the codec API translation/generalization and specifics, the software fallback decoding logic... while optimal code for specific hardware becomes driver-level work behind a common cross-vendor API with proper definition, compliance testing, etc.

    AFAIK GPUs that don't have an ASIC for video encoding/decoding might still implement it in their driver over the available shaders and whatnot and expose this under the Vulkan Video API, thus making this a natural "best available option" API.

    And I bet Lavapipe will also end up implementing some Vulkan Video over CPU codepaths that may leverage AVX and such, as available and applicable.

    Which means that at some point targeting this API might be all you really need in GStreamer and the likes of it, game engines, or even directly in apps.

    IMHO, probably the biggest benefit of this being done as a Vulkan API extension (beyond having proper API governance, documentation, visibility, etc.) is that the encoding/decoding codepaths become driver development, so they get done by devs who understand the hardware and are better positioned to optimize for it, while app logic stays in userland with app devs.
    Last edited by marlock; 29 December 2024, 11:28 PM.
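
    As a rough illustration of the query-then-fall-back idea above, here is a minimal, untested C sketch that checks whether a device reports the VK_KHR_video_queue and VK_KHR_video_decode_av1 extensions (those are the official extension names; the rest of the setup is stripped to the bare minimum and assumes a reasonably recent Vulkan SDK and driver):

    #include <stdbool.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <vulkan/vulkan.h>

    /* Return true if the physical device advertises the given extension. */
    static bool has_extension(VkPhysicalDevice gpu, const char *name)
    {
        uint32_t count = 0;
        vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, NULL);

        VkExtensionProperties *props = calloc(count, sizeof(*props));
        vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, props);

        bool found = false;
        for (uint32_t i = 0; i < count; i++) {
            if (strcmp(props[i].extensionName, name) == 0) {
                found = true;
                break;
            }
        }
        free(props);
        return found;
    }

    int main(void)
    {
        VkInstanceCreateInfo ici = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
        VkInstance instance;
        if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS)
            return 1;

        uint32_t gpu_count = 1;
        VkPhysicalDevice gpu;
        vkEnumeratePhysicalDevices(instance, &gpu_count, &gpu); /* first GPU only */

        if (gpu_count > 0 &&
            has_extension(gpu, "VK_KHR_video_queue") &&
            has_extension(gpu, "VK_KHR_video_decode_av1"))
            printf("Vulkan Video AV1 decode path available\n");
        else
            printf("falling back to VA-API/V4L2/software decode\n");

        vkDestroyInstance(instance, NULL);
        return 0;
    }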


  • Quackdoc
    replied
    Originally posted by polarathene View Post

    Igalia seems to have plans for GStreamer and implementing mobile support, based on these slides?



    So perhaps there's some way that's wired up with mobile devices that offer Vulkan drivers? They also talk about a Zink experiment to leverage Vulkan Video.
    GStreamer support is underway, but that's separate from mobile support. They don't really say how they implement this, but if I were to take a guess (and mind you, this is with very little context), I presume it would be best to implement this as a Vulkan layer. Keep in mind that the GPU and the hwaccel block can be swapped independently of each other, which means you can't make your Vulkan driver "support whatever ASIC you have". Rather, it might make sense for them to use layers or something like that to bridge to the ASICs or, more likely, to bridge to v4l2-m2m.
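
    To make the "GPU driver and decode ASIC are separate things" point concrete, here is a small, untested C sketch of the check an app (or a framework) effectively has to do: Vulkan Video is only usable if the loaded driver, or a layer sitting on top of it, actually advertises a queue family with video-decode support. A recent vulkan header is assumed for the VK_QUEUE_VIDEO_DECODE_BIT_KHR flag:

    #include <stdbool.h>
    #include <stdlib.h>
    #include <vulkan/vulkan.h>

    /* True if any queue family of this device exposes video decode. */
    bool gpu_exposes_video_decode_queue(VkPhysicalDevice gpu)
    {
        uint32_t count = 0;
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, NULL);

        VkQueueFamilyProperties *families = calloc(count, sizeof(*families));
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, families);

        bool found = false;
        for (uint32_t i = 0; i < count; i++) {
            if (families[i].queueFlags & VK_QUEUE_VIDEO_DECODE_BIT_KHR) {
                found = true; /* the driver really routes to a decode engine */
                break;
            }
        }
        free(families);
        return found;
    }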


  • polarathene
    replied
    Originally posted by Quackdoc View Post

    Vulkan Video is for GPU hwaccel ASICs. ARM and RISC-V SBCs' hardware accelerators are typically not part of the GPU, so in the first place Vulkan Video accel doesn't really make sense for them.
    Igalia seems to have plans for GStreamer and implementing mobile support, based on these slides?



    So perhaps there's some way that's wired up with mobile devices that offer Vulkan drivers? They also talk about a Zink experiment to leverage Vulkan Video.


  • Pheoxy
    replied
    Originally posted by Quackdoc View Post

    Vulkan Video is for GPU hwaccel ASICs. ARM and RISC-V SBCs' hardware accelerators are typically not part of the GPU, so in the first place Vulkan Video accel doesn't really make sense for them.
    Ah well, I didn't really expect much for hardware acceleration on current arm64 devices anyway. Anything I use that needs acceleration usually gets pointed to my amd64 machines anyway.


  • Quackdoc
    replied
    Originally posted by Pheoxy View Post

    I haven't seen anything about this. Vulkan Video extensions don't seem to get much attention on details like this.

    Do different architectures also have some effect I should be accounting for or be careful of?
    Vulkan Video is for GPU hwaccel ASICs. ARM and RISC-V SBCs' hardware accelerators are typically not part of the GPU, so in the first place Vulkan Video accel doesn't really make sense for them.


  • Pheoxy
    replied
    Originally posted by Quackdoc View Post

    Well, as long as you don't plan on using your scripts on ARM/RISC-V SBCs with their hwaccelerated stuff, it should be fine. If you use Vulkan processing like libplacebo you should also be getting more perf there.
    I haven't seen anything about this. Vulkan Video extensions don't seem to get much attention on details like this.

    Do different architectures also have some effect I should be accounting for or be careful of?


  • darkbasic
    replied
    av9 next please.


  • Quackdoc
    replied
    Originally posted by Pheoxy View Post

    Exactly, it's just a more standardized way to access the hardware layer, so vendors don't have to also create their own interface layer; they can focus more on the driver instead, and it saves on duplicated work.

    I'm very interested in this, as it makes scripts and workflows easier to manage, and I don't have to do extra work verifying configuration settings between vendors with tools like ffmpeg.
    Well, as long as you don't plan on using your scripts on ARM/RISC-V SBCs with their hwaccelerated stuff, it should be fine. If you use Vulkan processing like libplacebo you should also be getting more perf there.


  • Pheoxy
    replied
    Originally posted by Quackdoc View Post

    The Vulkan extension is just an interface to the hwdecode ASICs on the GPU; no GPUs will get expanded support.
    Exactly, it's just a more standardized way to access the hardware layer, so vendors don't have to also create their own interface layer; they can focus more on the driver instead, and it saves on duplicated work.

    I'm very interested in this, as it makes scripts and workflows easier to manage, and I don't have to do extra work verifying configuration settings between vendors with tools like ffmpeg.
    Last edited by Pheoxy; 26 December 2024, 08:04 AM.
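
    On the "same setup across vendors" point, here is a short, untested C sketch using FFmpeg's libavutil hardware-device API: creating a Vulkan hw device context is the same av_hwdevice_ctx_create() call regardless of which vendor's driver is installed (build flags assumed: pkg-config --cflags --libs libavutil):

    #include <stdio.h>
    #include <libavutil/hwcontext.h>

    int main(void)
    {
        AVBufferRef *hwdev = NULL;

        /* NULL device string lets FFmpeg pick the default Vulkan device;
         * the call is identical whether the GPU is Intel, AMD or NVIDIA. */
        int ret = av_hwdevice_ctx_create(&hwdev, AV_HWDEVICE_TYPE_VULKAN,
                                         NULL, NULL, 0);
        if (ret < 0) {
            fprintf(stderr, "no usable Vulkan device (error %d)\n", ret);
            return 1;
        }

        printf("Vulkan hw device context created\n");
        av_buffer_unref(&hwdev);
        return 0;
    }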
