Better AMD Radeon VCE Video Encode Performance Coming To Linux


  • Artim
    replied
    Originally posted by Go_Vulkan View Post
Well, home office apparently is no problem. Be it my camera, my CPU, my GPU, they all encode on the fly without any issue. SVT-AV1 raises the quality level.

Regarding streamers, they often exaggerate their own importance, thinking they are the centre of the world. How many companies are "enough", ones that actually stream yet cannot afford a proper setup with a normal gaming CPU like the Ryzen 5900X?
Ever heard the magic word "efficiency"? Home office works well because pretty much every major video conferencing package defaults to H.264, which almost every device supports in hardware. Only Cisco is integrating AV1 for desktop sharing, and only for content with lots of motion, and only on desktop computers.

And it doesn't matter whether a company could afford "a proper setup" (thinking that means a gaming CPU is just hilarious; a proper server CPU would be much more like it). Encoding has to be efficient and fast, since companies also have to pay for the power to drive it. Software encoding will never be as efficient as hardware encoding, while the drawbacks in quality are minor enough that the majority of people wouldn't even notice.

And do you actually think streaming companies like Google, Netflix and Amazon would turn down proper hardware encoding of the latest royalty-free codec? It would let them replace their current, less efficient codecs (in terms of quality relative to bitrate) without much higher power consumption (given that they can currently encode everything in hardware), save storage capacity on their own servers and those of their CDNs, and deliver higher quality to consumers with bad or volume-limited internet connections.

Then tell me why Google spent years building a much more efficient VP9 encoder, why they already try to ship as much AV1 as their servers can deliver, why they made it mandatory for the new Android TV, and why Netflix helped develop SVT-AV1, so they could use something much more efficient than the reference encoder as long as there is no dedicated hardware. The answer to all of it: efficiency.



  • Go_Vulkan
    replied
Well, home office apparently is no problem. Be it my camera, my CPU, my GPU, they all encode on the fly without any issue. SVT-AV1 raises the quality level.

Regarding streamers, they often exaggerate their own importance, thinking they are the centre of the world. How many companies are "enough", ones that actually stream yet cannot afford a proper setup, when even among gamers the Ryzen 5900X is a typical CPU?
    Last edited by Go_Vulkan; 03 January 2022, 04:39 PM.



  • Artim
    replied
    Originally posted by Go_Vulkan View Post
It is no wonder that OBS Studio is adding real-time AV1 encoding on the CPU (SVT-AV1, https://www.phoronix.com/scan.php?pa...io-27.2-Beta-1), because this is the way to go. By the way, if everybody were streaming, nobody could watch any more, right? How many people are actually doing this?
Wrong. They did it because there aren't any hardware encoders for AV1, simple as that. Besides, it isn't something they really had to implement themselves; they just use the existing APIs.

And who's streaming? Besides just about everyone on Twitch, there are enough companies streaming their content. Just look at YouTube and what they have to encode every minute, even though those are live encodings. They even built themselves a VP9 encoder; they will probably do the same for AV1.

Plus it's quite funny to ask "who's streaming" at a time when many people have to work from home and therefore make video calls that have to be encoded.



  • Go_Vulkan
    replied
It is no wonder that OBS Studio is adding real-time AV1 encoding on the CPU (SVT-AV1, https://www.phoronix.com/scan.php?pa...io-27.2-Beta-1), because this is the way to go. By the way, if everybody were streaming, nobody could watch any more, right? How many people are actually doing this?
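For anyone curious what that kind of software encode looks like in practice, here is a minimal sketch (mine, not from this thread) using ffmpeg's libsvtav1 encoder; it assumes an ffmpeg build compiled with SVT-AV1 support, and the file names are placeholders:

```shell
# Sketch of a speed-oriented SVT-AV1 software encode. input.mp4/output.mkv
# are placeholder names. Preset 10 trades compression efficiency for the
# kind of speed live use needs; lower presets are slower but compress better.
svt_flags="-c:v libsvtav1 -preset 10 -crf 35"
echo "ffmpeg -i input.mp4 $svt_flags -c:a copy output.mkv"
# Only run the real encode when ffmpeg and the input actually exist.
if command -v ffmpeg >/dev/null 2>&1 && [ -f input.mp4 ]; then
    ffmpeg -i input.mp4 $svt_flags -c:a copy output.mkv
fi
```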



  • mdedetrich
    replied
    Originally posted by Go_Vulkan View Post
If we're talking about serious video encoding, you will get a 16-core / 32-thread AMD Ryzen 5950X or (even better) a Threadripper. GPU hardware encoders suck quality-wise. Encoding is a one-time job, whereas the result will be delivered to millions of users.
That is not the only use case; you are conveniently ignoring streaming. The specific personal case I had to deal with was encoding real-time security footage, and there NVidia GPUs were the way to go (at least if you didn't want to break the bank).



  • Go_Vulkan
    replied
If we're talking about serious video encoding, you will get a 16-core / 32-thread AMD Ryzen 5950X or (even better) a Threadripper. GPU hardware encoders suck quality-wise. Encoding is a one-time job, whereas the result will be delivered to millions of users.
    Last edited by Go_Vulkan; 02 January 2022, 09:04 PM.



  • mdedetrich
    replied
    Originally posted by sophisticles View Post

    It is not in AMD's interest to take hardware encoding seriously. AMD has spent years promoting the notion of "more cores" and one of the few things that benefit from the number of cores/threads AMD offers in the high end is video encoding.

    Now imagine if AMD were to release a cheap APU with a great hardware encoder, that would cannibalize AMD's high end sales.

    Intel has more motivation because they don't have as many cores at the high end nor do they have the same power consumption.

NVIDIA, since they don't make any CPUs, has even more motivation to take market share by developing good hardware encoders.
I think you are exaggerating the cannibalisation here. While it's true you can brute-force encoding with lots of CPU cores, it's an extremely cost-inefficient way to do so, and in reality a very minor reason to buy AMD CPUs. Realistically, AMD is not gaining any real sales by encouraging people to buy its CPUs for video encoding.

The fact of the matter is that if you do video encoding at any serious level, you buy an NVidia GPU, since it has dedicated silicon to perform the encoding; x86 processors do not.
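As a concrete illustration (my own sketch, not anything from the thread): that dedicated silicon is exposed through ffmpeg's NVENC encoders, so offloading H.264 encoding to the GPU is a one-liner, assuming an NVENC-capable card and an ffmpeg build with NVENC enabled; the file names are placeholders:

```shell
# Sketch: offload H.264 encoding to NVidia's dedicated encoder block via
# ffmpeg's h264_nvenc encoder. The encode itself runs on GPU silicon,
# leaving the CPU cores free. input.mp4/output.mp4 are placeholders.
nvenc_cmd="ffmpeg -i input.mp4 -c:v h264_nvenc -preset p5 -b:v 6M -c:a copy output.mp4"
echo "$nvenc_cmd"
# Only attempt the real encode when ffmpeg and the input actually exist.
if command -v ffmpeg >/dev/null 2>&1 && [ -f input.mp4 ]; then
    $nvenc_cmd
fi
```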

This also applies in the server/enterprise space: EPYC servers are useful for web servers handling a lot of standard business logic, but if you are doing any kind of encoding, you don't use AMD CPUs for it.

Ultimately, I think the real reason AMD's hardware encoding on GPUs is subpar is simply that it wasn't a financial priority (remember, we are talking about a company that almost went bankrupt ~7 years ago). At that time video encoding was a pretty niche thing, and AMD GPUs had very little penetration in the enterprise; they were pretty much all focused on budget-value gamers.

If I were them in that situation, I also wouldn't have focused on video encoding; I would get the basic product right instead. Now it's different, though, and they really should hire someone who's dedicated to video encoding, if they haven't already done so.
    Last edited by mdedetrich; 01 January 2022, 07:21 PM.



  • sophisticles
    replied
    Originally posted by mdedetrich View Post
I would like AMD to start taking video encoding seriously; at this point, compared to the competition (NVidia), it's really embarrassing, and it is yet another reason on the long list of why NVidia is ahead of AMD in a lot of more professional areas.
    It is not in AMD's interest to take hardware encoding seriously. AMD has spent years promoting the notion of "more cores" and one of the few things that benefit from the number of cores/threads AMD offers in the high end is video encoding.

    Now imagine if AMD were to release a cheap APU with a great hardware encoder, that would cannibalize AMD's high end sales.

    Intel has more motivation because they don't have as many cores at the high end nor do they have the same power consumption.

NVIDIA, since they don't make any CPUs, has even more motivation to take market share by developing good hardware encoders.



  • theriddick
    replied
NVIDIA drivers have a big advantage when it comes to Wine/Proton gaming because, unlike AMD's, their Linux drivers are based on their Windows binary drivers.

If AMD EVER adopts an open-source driver solution for Windows, it will be great news for Linux, as developers will code/test against it, thus helping Wine/Proton stuff under Linux for AMD.

    At least that is what I think/hope; but I'm probably wrong.



  • shanedav4
    replied
    Originally posted by mdedetrich View Post
I would like AMD to start taking video encoding seriously; at this point, compared to the competition (NVidia), it's really embarrassing, and it is yet another reason on the long list of why NVidia is ahead of AMD in a lot of more professional areas.
I hope they also make it easier to set up. AMD video encoding has always been a nightmare to get configured, whereas with Nvidia all you need to do is install the right packages and it just works. On Linux, AMD has been gaming-only unless you like pulling your hair out getting encoding to work. I hope they are going to make more than a half-a#sed attempt.
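For what it's worth, the usual AMD route on Linux goes through VAAPI. This is only a sketch of the common checklist (mine, not an official guide); package names vary by distro (e.g. Mesa's VAAPI driver package), and the render node path is the usual default, not guaranteed:

```shell
# Sketch of the typical AMD/VAAPI encode sanity check on Linux.
# 1) vainfo should list encode entrypoints (VAEntrypointEncSlice);
#    if it doesn't, the VAAPI driver packages are likely missing.
if command -v vainfo >/dev/null 2>&1; then
    vainfo 2>/dev/null | grep -i encslice || echo "no encode entrypoints found"
fi
# 2) Encode through the GPU with ffmpeg's h264_vaapi encoder; the device
#    path and file names are common defaults / placeholders.
vaapi_cmd="ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 -vf format=nv12,hwupload -c:v h264_vaapi -qp 24 output.mp4"
echo "$vaapi_cmd"
```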

