Radeon "RADV" Vulkan Driver Adds Experimental Support For Sienna Cichlid



    Phoronix: Radeon "RADV" Vulkan Driver Adds Experimental Support For Sienna Cichlid

    Following AMD publishing the open-source Linux driver patches for "Sienna Cichlid" (Navi 2) that included the RadeonSI OpenGL driver changes, the RADV Vulkan driver has now tacked on support for this next-generation Navi GPU...


  • #2
    I'd suspect that with ACO also used for RadeonSI, the Sienna Cichlid launch would be much smoother in terms of system stability. It'd be really great if that were within reach.
    At least it looks like, with Mesa support ready months ahead, it could be a much better launch than Navi 10 anyway.


    • #3
      Originally posted by aufkrawall View Post
      I'd suspect that with ACO also used for RadeonSI, the Sienna Cichlid launch would be much smoother in terms of system stability. It'd be really great if that were within reach.
      At least it looks like, with Mesa support ready months ahead, it could be a much better launch than Navi 10 anyway.
      AMD's Day 1 (Linux) GPU support has always been an iffy-at-best prospect, especially for an informed consumer. All these articles lately are making me cautiously optimistic for SC.

      I just hope this generation finally has an RX 580 replacement in the $150-$250 range.


      • #4
        Originally posted by skeevy420 View Post

        AMD's Day 1 (Linux) GPU support has always been an iffy-at-best prospect, especially for an informed consumer. All these articles lately are making me cautiously optimistic for SC.

        I just hope this generation finally has an RX 580 replacement in the $150-$250 range.
        I hope for this too. For half a decade the GPU market has been extremely overpriced, and not only that, but the vast majority of offerings have had overinflated TDPs. And really, what concerns me more is the TDP issue, not the cost issue.

        I remember in January of 2008 I got an ATI 3870 with 512MB of GDDR4 RAM. I paid around 210 euros then. Out of all the GPUs I ever bought, that was my favourite purchase. Around 105W TDP, almost silent, could play every game I could throw at it at max settings @1050p. I just loved that GPU so much.

        Then something happened and it was like a competition to make 300W TDP goliath GPUs mainstream. I suppose they ran into issues on the transition to 20nm and, instead of waiting, they just made behemoths instead. And that stuck. AMD just does not release sub-140W TDP GPUs anymore. WTF? Why do I need a heater or an oven inside my desktop?

        I am not a heavy gamer and I don't need 4K 60fps raytraced gaming, but I also play some strategy games and RPGs from time to time and would like to have some GPU power, higher than a poor laptop's. So where does that leave me? If I want to buy from AMD, there is nothing from recent generations to satisfy me. They are just not covering the segment from iGPU up to 100-120W TDP anymore. That means people who would like something better than a bandwidth-bottlenecked iGPU, but not something really demanding in power, size, noise, and cost. AMD just abandoned the mainstream GPU market without replacing it with gaming APUs. I just want a GPU around 80W-120W, that's all I am asking.


        • #5
          I’m assuming this is the GPU that will be in the PS5?


          • #6
            I can't wait for all the bits and pieces being welded together for full big Navi support to take a big freaking step ahead. Looking at the slides, big Navi should (in theory) be capable of scaling in a nearly linear fashion towards 80 CUs. And if that lands with acceptable raytracing support, this card will be the chosen card for the next few years - even if it is a tad expensive.

            And with that, we have a powerful base to build "Gaming on Linux" on. First come the consoles, which will have optimized games, so they shouldn't be "as plagued" with GameWorks shenanigans; we get raytracing support (where I can see a lot of driver magic happening - looking at the latest tech videos on efficient computing); and finally, it's the magic bullet towards 4K @ >100Hz, which is the sweet spot for enthusiasts.


            • #7
              Originally posted by skeevy420 View Post
              I just hope this generation finally has an RX 580 replacement in the $150-$250 range.
              What the hell, I just bought an RX 570 6 months ago.


              • #8
                Originally posted by atomsymbol

                It looks like it will be the same GPU architecture as the PS5/Xbox Series X, without some of the console-specific features such as streaming & decompression of data from the SSD into GPU memory.
                Then I hope it will come with HDMI 2.1 so I can play games and watch movies in 4K @ 120 Hz on a TV screen.
                Hopefully VRR will work too.
                Before anyone jumps in to comment: no, I don't care about all the latest games being playable in 4K, I'm happy to play older games that I enjoy in 4K.


                • #9
                  Originally posted by TemplarGR View Post
                  I remember in January of 2008 I got an ATI 3870 with 512MB of GDDR4 RAM. I paid around 210 euros then. Out of all the GPUs I ever bought, that was my favourite purchase. Around 105W TDP, almost silent, could play every game I could throw at it at max settings @1050p. I just loved that GPU so much. Then something happened and it was like a competition to make 300W TDP goliath GPUs mainstream... AMD just abandoned the mainstream GPU market without replacing it with gaming APUs. I just want a GPU around 80W-120W, that's all I am asking.
                  Not entirely true. You just have to look a little...
                  • My now-old $110 AMD HD 7750 would run circles around your ATI 3870 while using 55W, and it is still covered by AMD's latest cutting-edge drivers (with the caveat of being GCN1...).
                  • Today, you could just order a reasonably modern $99 RX 560, which would run circles around my old 7750 while using 80W.
                  • ...or you could buy a faster GPU and underclock it for even quieter performance...
                  The truth is that the lower performance tiers often run on older technology, because developing new low-performance GPUs on expensive cutting-edge nodes doesn't make a lot of sense.


                  • #10
                    Originally posted by TemplarGR View Post
                    Then something happened and it was like a competition to make 300W TDP goliath gpus mainstream. ... AMD just does not release sub-140W TDP gpus anymore. ... They are just not covering the segment from igpu to 100-120W TDP anymore. ... 80W-120W, that's all i am asking.
                    It used to be that transitioning to a denser node resulted in cost savings. That changed around the 28 nm mark. This means it's cheaper to keep manufacturing older GPUs on older nodes, at reduced prices, than to make new GPUs on more advanced nodes targeted at the lower end of the market.
