AMD Announces Navi 14 Based Radeon RX 5500 Series


  • #11
    Originally posted by Terr-E View Post
    If so, what would be the advantage over simply buying an RX 570? Will it be cheaper? Run cooler? Very curious as to what this card will bring to the table…
    AMD is claiming performance between the 580 and 590, with considerably better power efficiency than those cards.



    • #12
      Originally posted by smitty3268 View Post

      AMD is claiming performance between the 580 and 590, with considerably better power efficiency than those cards.
      With only 22 compute units versus 36 in the 580, I have my doubts, but we'll see.
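      For a rough sense of scale, here's a back-of-the-envelope FP32 throughput comparison (a sketch in Python; the Navi 14 clock is my assumption, not an official spec, and RDNA does more work per clock than GCN, so raw TFLOPS understate Navi):

      # Back-of-envelope FP32 throughput: CUs * 64 lanes * 2 FLOPs/clock * clock.
      # Clocks are approximate boost clocks; the Navi 14 figure is a guess.
      def tflops(cus, ghz):
          return cus * 64 * 2 * ghz / 1000.0

      print(f"RX 580  (36 CU @ ~1.34 GHz): {tflops(36, 1.34):.1f} TFLOPS")  # ~6.2
      print(f"Navi 14 (22 CU @ ~1.70 GHz): {tflops(22, 1.70):.1f} TFLOPS")  # ~4.8

      So on paper 22 CUs do trail the 580, and any parity would have to come from clocks and per-clock efficiency.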



      • #13
        Originally posted by tuxd3v View Post

        Maybe to be more competitive with the NVIDIA GTX 16 series, with lower power consumption?
        Anyway, I think the RX 500 series is a very nice piece of hardware with very good OpenCL 2.0 support. The problem is that it is not a universal card (it needs PCIe 3.0 atomic operations supported by both the CPU and the motherboard to be compliant).

        This requirement (PCIe 3.0 atomics) is not good.
        Maybe the RX 5500 series doesn't have that limitation?

        After all, Nvidia cards are universal: you can throw a recent Nvidia card into a PCIe 1.1/2.0/3.0 slot and it will work there. That is something very valuable when people buy hardware; Nvidia got it right.
        Sometimes you have old machines around, and if they work well, why should you need to replace them? (Sometimes they are even part of a bigger installation that you don't want to mess with too much, or you end up needing a whole new set of projects, with downtime and losses guaranteed.)
        Right: you replace the hardware that breaks (and that is where Nvidia got it right).
        PCIe atomics have been available since Haswell on Intel (2013). Supporting six-year-old PCs is normally plenty; we're talking about new hardware here.

        PCIe atomics are, afaik, available since Ryzen on AMD (2017). That's admittedly a lot less, but let's face it: AMD was really lagging behind in that period. Bulldozer never performed especially well. I don't think it's worthwhile designing around such old platform requirements when you're building hardware right now; building chips for today's standards is just more sensible.
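
        If you want to know whether a given Linux box actually advertises PCIe atomics, one option is to look at the AtomicOpsCap field that lspci reports under DevCap2. A minimal sketch, assuming pciutils is installed (run it as root, since reading the capability registers needs elevated access):

        #!/usr/bin/env python3
        # Sketch: list PCI devices advertising AtomicOp capabilities via lspci.
        import subprocess

        out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

        device = None
        for line in out.splitlines():
            if line and not line[0].isspace():       # unindented line = new device header
                device = line
            elif "AtomicOpsCap" in line and device:  # DevCap2 atomic-op capability bits
                print(device)
                print("   ", line.strip())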



        • #14
          bridgman: Is Navi 14 (aka the Radeon RX 5500) capable of 8K for desktop use? You pointed out that Navi is optimized for 4K and 8K, but nothing is given except Full HD, as if it were a Polaris card from nearly four years ago. Now that the product is announced, I hope such basic info can be given?
          What about the mobile version? Is it a Navi APU for the desktop, and if so, is it fully capable of 4K and 8K, or is it a chipset-limited version like Raven Ridge and its (too many) descendants?
          It would be great to buy an AMD system this year, but without 8K my Haswell with its iGPU would do the same job... so I really hope these are cards/APUs that will still be usable in 2020 and beyond.
          And by the way: any hope of getting Navi running 8K on Linux this year?



          • #15
            Originally posted by Hibbelharry View Post
            PCIe atomics are, afaik, available since Ryzen on AMD (2017). That's admittedly a lot less, but let's face it: AMD was really lagging behind in that period. Bulldozer never performed especially well. I don't think it's worthwhile designing around such old platform requirements when you're building hardware right now; building chips for today's standards is just more sensible.
            I am not saying otherwise, but when I buy hardware, I buy it with longevity in mind.

            Besides, there is still a lot of much older production hardware out there, and you need to replace spare parts from time to time. How will you buy new products for those machines if the new products don't support those technologies?

            If you only build for today or tomorrow, you don't support yesterday's products (this is a very big problem); you have only one small market, and you are not playing for the bigger pie.
            Nvidia and others will capitalise on that, selling hardware that is supported today and also supports past technologies, so they have all the markets, without competition.

            You will sell a couple of products, but you will not get further in your business. Having no longevity plan for hardware will turn you into a very low-tier company.



            • #16
              Well, I think you should go with 16K resolution... the number is bigger.



              • #17
                Originally posted by tuxd3v View Post

                Maybe to be more competitive with the NVIDIA GTX 16 series, with lower power consumption?
                Anyway, I think the RX 500 series is a very nice piece of hardware with very good OpenCL 2.0 support. The problem is that it is not a universal card (it needs PCIe 3.0 atomic operations supported by both the CPU and the motherboard to be compliant).

                This requirement (PCIe 3.0 atomics) is not good.
                Maybe the RX 5500 series doesn't have that limitation?

                After all, Nvidia cards are universal: you can throw a recent Nvidia card into a PCIe 1.1/2.0/3.0 slot and it will work there. That is something very valuable when people buy hardware; Nvidia got it right.
                Sometimes you have old machines around, and if they work well, why should you need to replace them? (Sometimes they are even part of a bigger installation that you don't want to mess with too much, or you end up needing a whole new set of projects, with downtime and losses guaranteed.)
                Right: you replace the hardware that breaks (and that is where Nvidia got it right).
                No! Seriously, guy, why would anybody buy a modern performance GPU to use in an ancient computer?



                • #18
                  Originally posted by Neuro-Chef View Post
                  Well, it most likely won't handle AV1; at least the RX 5700 (XT) does not.

                  And with only RX 570-like performance, it should be a little bit cheaper.
                  Well, that would suck, because AV1 was on my mind here. Hopefully details can be scraped up soon, but there is little sense in buying a GPU that can't do AV1 decode in hardware.
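
                  One way to check what a card's decoder actually exposes on Linux is to ask VA-API. A minimal sketch, assuming libva-utils (vainfo) is installed and the open-source driver stack is in use:

                  #!/usr/bin/env python3
                  # Sketch: check whether the VA-API driver exposes AV1 decode.
                  # vainfo prints one "VAProfile... : VAEntrypoint..." pair per profile.
                  import subprocess

                  out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout

                  av1 = [l.strip() for l in out.splitlines()
                         if "VAProfileAV1" in l and "VAEntrypointVLD" in l]

                  print("\n".join(av1) if av1 else "No AV1 hardware decode exposed here.")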



                  • #19
                    Originally posted by tuxd3v View Post
                    If you only build for today or tomorrow, you don't support yesterday's products (this is a very big problem); you have only one small market, and you are not playing for the bigger pie.

                    Nvidia and others will capitalise on that, selling hardware that is supported today and also supports past technologies, so they have all the markets, without competition.

                    You will sell a couple of products, but you will not get further in your business. Having no longevity plan for hardware will turn you into a very low-tier company.
                    Nvidia has other problems: their hardware generally does not age well. As soon as a card goes into legacy support, things get crappy rather quickly, with no support for newer X11 versions, no support for newer technology beyond X11, compiler hassles, kernel update woes...

                    You just can't have it all.



                    • #20
                      Originally posted by wizard69 View Post
                      Well, that would suck, because AV1 was on my mind here. Hopefully details can be scraped up soon, but there is little sense in buying a GPU that can't do AV1 decode in hardware.
                      The RX 5700 can't do AV1, so I guess this won't be compatible either; it's basically the same generation of silicon. Nvidia can't do it either. I don't think we will see integrated AV1-capable hardware before the next bigger chip refresh.

                      The only ASIC I know of so far is this one:
