AMD Announces Navi 14 Based Radeon RX 5500 Series


  • artivision
    replied
    Originally posted by fuzz View Post

It works with VAAPI (and has for some time). It's limited, but there is a "way". I've used it successfully for real-time encoding without issue.
    Can you give me a command line?
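A minimal sketch of what such an invocation can look like via ffmpeg's VAAPI path (assuming an ffmpeg build with VAAPI enabled and /dev/dri/renderD128 as the render node; adjust both for your system):

    # Hedged example: encode input.mp4 to H.264 on the GPU via VAAPI
    ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
           -vf 'format=nv12,hwupload' -c:v h264_vaapi -qp 24 output.mp4

The format=nv12,hwupload filter converts frames to NV12 and uploads them to a GPU surface before they reach the h264_vaapi encoder; -qp sets a constant quantizer, since bitrate-control options vary by driver.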



  • fuzz
    replied
    Originally posted by artivision View Post
Is there any way to encode H.264 video with the open-source Radeon driver yet? Or is it complete trash as always?
It works with VAAPI (and has for some time). It's limited, but there is a "way". I've used it successfully for real-time encoding without issue.



  • artivision
    replied
Is there any way to encode H.264 video with the open-source Radeon driver yet? Or is it complete trash as always?



  • MadeUpName
    replied
The only place I have found real info on VCN 2.0 is this page, but it doesn't mention AV1 at all.



  • Hibbelharry
    replied
    Originally posted by wizard69 View Post
Well that would suck, because AV1 was on my mind here. Hopefully details can be scraped up soon, but there is little sense in buying a GPU that can't do AV1 decode in hardware.
The RX 5700 can't do AV1, so I doubt this one will either; it's essentially the same generation of silicon. Nvidia can't do it either. I don't think we will see integrated AV1-capable hardware before the next bigger chip refreshes.

The only ASIC I know of so far is this one:
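For what it's worth, what the installed driver actually exposes can be checked with vainfo from libva-utils; a quick sketch (profile names follow libva's VAProfile naming, e.g. VAProfileAV1Profile0 for AV1 decode):

    # Hedged example: list the advertised VAAPI profiles and look for AV1 entries
    vainfo | grep -i av1

If nothing is printed, the card/driver combination advertises no AV1 support.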



  • Hibbelharry
    replied
    Originally posted by tuxd3v View Post
Because if you only build for today or tomorrow, you don't support yesterday's products (this is a very big problem). You only have one small market; the bigger pie, you are not playing for.

Nvidia and others will capitalise on that, selling hardware that is supported today and also supports past technologies, so they have all markets, without competition.

I think you will sell only a couple of products, but you will not succeed further in your business. Having no longevity plan for hardware will turn you into a very low-tier company.
Nvidia has other problems; its hardware generally does not age well. As soon as a card drops into legacy support, things get crappy rather quickly: no support for newer X11 versions, no support for newer technology beyond X11, compiler hassles, kernel update woes...

    You just can't have it all.



  • wizard69
    replied
    Originally posted by Neuro-Chef View Post
    Well, it most likely won't handle AV1, at least the RX 5700 (XT) does not.

And with only RX 570-like performance, it should be a little bit cheaper.
Well that would suck, because AV1 was on my mind here. Hopefully details can be scraped up soon, but there is little sense in buying a GPU that can't do AV1 decode in hardware.



  • wizard69
    replied
    Originally posted by tuxd3v View Post

Maybe to be more competitive with the Nvidia GTX 1600 series, with low power consumption?
Anyway, I think the RX 500 is a very nice piece of hardware with very good OpenCL 2.0 support. The problem is that it is not a universal card, because it needs PCIe 3.0 atomic operations supported by the CPU and the motherboard to be compliant.

This PCIe 3.0 atomics requirement is not good. Maybe the RX 5500 series doesn't have that limitation?

After all, Nvidia cards are universal: you can throw a recent Nvidia card into a PCIe 1.1/2.0/3.0 slot and it will work there. This is something very valuable when people buy hardware; Nvidia got it right.
Sometimes you do have old machines around, and if they work well, why should you need to change them? (Sometimes they are even part of a bigger installation, and you don't want to mess with that too much, or you end up needing a new set of projects, with downtime and losses guaranteed.) Right, you only change the hardware that breaks (and here is where Nvidia got it right).
No! Seriously, why would anybody buy a modern performance GPU to use in an ancient computer?
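As an aside to the atomics question: recent pciutils decodes the AtomicOps capability bits, so whether a device advertises PCIe atomics can be checked from userspace; a sketch (03:00.0 is an example bus address, substitute your GPU's from lspci):

    # Hedged example: look for AtomicOpsCap/AtomicOpsCtl in the capability dump
    sudo lspci -vv -s 03:00.0 | grep -i atomicops

A capable endpoint reports an AtomicOpsCap entry under DevCap2; whether the CPU's root port routes and completes those atomics is what the requirement is really about.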



  • tuxd3v
    replied
Well, I think you should go with 16K resolution... the number is bigger.



  • tuxd3v
    replied
    Originally posted by Hibbelharry View Post
PCIe atomics have, AFAIK, been available on AMD since Ryzen (2017). That's surely a lot less, but let's face it: AMD was really, really lagging behind in that period. Bulldozer never performed especially well. I don't think it's worthwhile relying on such old tech definitions when you're designing hardware right now, so building chips for today's standards is just more sensible.
I am not saying otherwise, but when I buy hardware, I buy it with longevity in mind.

Besides, there is still a lot of much older production hardware out there, and you need to replace some spare parts from time to time. How will you buy new products for those machines if the products do not support those technologies?

Because if you only build for today or tomorrow, you don't support yesterday's products (this is a very big problem). You only have one small market; the bigger pie, you are not playing for.
Nvidia and others will capitalise on that, selling hardware that is supported today and also supports past technologies, so they have all markets, without competition.

I think you will sell only a couple of products, but you will not succeed further in your business. Having no longevity plan for hardware will turn you into a very low-tier company.

