AMD Radeon RX 5700 / RX 5700XT Linux Gaming Benchmarks


  • digitalsin
    replied
    Michael
Why no 1080p benchmarks? Most people in this price range are not going to use 4K displays!
From other online reviews, in the 1080p benchmarks (on Windows 10) the 5700 XT outperforms the RTX 2080 in ~80% or more of them...
    Last edited by digitalsin; 07 July 2019, 05:13 PM. Reason: Tell people ;)

  • ms178
    replied
By the way, I hope we will also get Radeon Settings, Chill, and Anti-Lag for OpenGL/Vulkan on Linux in the future.

  • flashmozzg
    replied
Nice. I was worried that it'd be a disappointing flop, but it actually looks pretty promising. Not to the extent of the Zen 2 wins, but still, at least there is a legit choice now. Makes me hopeful for the "Big Navi" with raytracing that's rumored for next spring. I wonder how much of this is due to the 7nm process, and how NVIDIA's numbers would look if they'd just done a die shrink.

  • mphuZ
    replied
    Originally posted by gurv View Post
This release just shows the mess that is AMD Vulkan support on Linux.
It would be really good if they just gave up on AMDVLK (non-Pro) and instead contributed to RADV.
Abandon a cross-platform driver for a "driver for the 1%"? Are you crazy?

  • Shevchen
    replied
Thanks for the test and the results. There are still some fuzzy things going on, but the general performance picture is readable.

Now that the card is up and running, I hope the community finds ways to finally move beyond the old and rusty GCN architecture and bolt some afterburners onto RDNA (after initial OS support, of course). My 2700U APU runs RotTR at ~30 FPS on high settings, which would have been unbelievable a short while ago. Most of the games I'm interested in already run via Proton, and even Star Citizen got a "launcher" from the community.

Things are shaping up.

  • Michael
    replied
    Originally posted by bosjc View Post
    Michael - No Total War benchmarks?
    TWW2 uses Vulkan.

  • tildearrow
    replied
Even worse, there is absolutely no information about the Radeon Media Engine other than that it does 4K90 encode and things like that.

When Vega was released, the only thing they said was "VCE 4.0 is included in the upcoming Vega graphics cards". No new features.
When the Radeon VII came out, it was even worse: a version bump with no new features.
But now, here comes Navi, and guess what: the page for VCN still does NOT mention VCN 2.0!! Unbelievable!

For years, AMD has had the worst hardware encoder block compared to Intel or NVIDIA, and they *still* have not done anything to improve it:
- VCE 1.0 lost to NVENC and Quick Sync as soon as it was released. Having to crank up the bitrate to compensate was a pain.
- VCE 2.0 brought 4:4:4, but the guys at AMD for some reason decided to make it I-frame only, which means that if you try to encode something in 4:4:4 you'll waste a lot of disk space... I eventually found out that this was a feature requested by Sony for use in the PlayStation 4, for wireless displays or something like that. Meanwhile, NVIDIA supported 4:4:4 not only in I-frames but also in P-frames, which means you could actually encode a video in 4:4:4 with your Maxwell card. Intel, for their part, made a smart move: they stopped piling features onto Quick Sync and instead tried to increase quality. AMD.......
- VCE 3.0 brought HEVC, but for some freaking reason they decided to slow down H.264 encoding. How in the world does someone make such a foul decision?! NVIDIA has full-speed H.264 and HEVC encoding blocks, and when Intel brought HEVC encoding to Skylake, they didn't nuke the H.264 block either! What the heck?! (Also, this is very ironic, since HEVC is much harder to encode than H.264...)
- VCE 4.0 seems to bring no new features at all. By this time Intel supported lots of codecs, and NVIDIA had 4:4:4 encoding.
- VCN 1.0 brings even more codecs, but come on, it is APU-only. Are you kidding me? Couldn't you have put VCN 1.0 in your Radeon VII?! Because of this we now have two choices: high-end with the old arch, or upper-mid-range with the new arch. There is no high-end option with the new arch!
- By the time VCN 2.0 was being planned, NVIDIA released Turing with more 4:4:4 goodness and a higher-quality encoder. COME ON, AMD!! WHAT is hindering you from improving your darn encoder?! Do you, like, depend on some other company for the encoding block? Wow.
- Meanwhile, Intel announced they'll bring 4:4:4 to Ice Lake.

I'm pretty sure VCN 2.0 is only capable of freaking 4:2:0 crap, with NO 4:4:4 whatsoever. There's no information about it, so what can I assume? No changes, of course.

    Screw it! I'll have to use my spare NVIDIA card for encoding and my AMD card for rendering. *sigh*
    Last edited by tildearrow; 07 July 2019, 01:21 PM.
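For anyone wondering why 4:4:4 eats so much more space than 4:2:0 in the post above: chroma subsampling stores the two color planes at reduced resolution, so the raw sample count per frame shrinks before the codec even starts compressing. A minimal sketch of that arithmetic (illustrative only; `samples_per_frame` is a made-up helper, not any encoder's API):

```python
# Raw luma+chroma sample counts per frame for the two chroma-subsampling
# formats discussed above. Purely illustrative arithmetic.

def samples_per_frame(width, height, subsampling):
    """Total Y + Cb + Cr samples in one frame for a given subsampling mode."""
    luma = width * height
    if subsampling == "4:4:4":    # both chroma planes at full resolution
        chroma = 2 * luma
    elif subsampling == "4:2:0":  # both chroma planes at quarter resolution
        chroma = 2 * (luma // 4)
    else:
        raise ValueError(f"unknown subsampling: {subsampling}")
    return luma + chroma

full = samples_per_frame(1920, 1080, "4:4:4")  # 6,220,800 samples
sub = samples_per_frame(1920, 1080, "4:2:0")   # 3,110,400 samples
print(full / sub)  # -> 2.0: 4:4:4 carries twice the raw samples of 4:2:0
```

Which is why an I-frame-only 4:4:4 mode hurts twice: you pay double the raw chroma data and lose inter-frame compression on top of it.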

  • gurv
    replied
This release just shows the mess that is AMD Vulkan support on Linux.
It would be really good if they just gave up on AMDVLK (non-Pro) and instead contributed to RADV.

  • bosjc
    replied
    Michael - No Total War benchmarks?

  • schmidtbag
    replied
Pretty damn impressive results for Linux on launch day. Considering how late some of the commits came in, I expected Michael to have a page's worth of struggles. Instead, performance actually seems comparable to Windows.
Good job, Linux team!
