AMD Radeon RX 5700 / RX 5700 XT Linux Gaming Benchmarks


  • coder
    replied
    Originally posted by ic3man5 View Post
For some reason I'm not getting the same results as the AMD RX 5700 XT benchmarks are showing. I'm assuming I did something wrong?
    It's weird that your average is 59.5. That almost suggests vsync is enabled, although your max would seem to contradict that.

    Anyway, did you run it as part of the Phoronix Test Suite, or by hand? If the latter, are you sure the settings match PTS'?
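
    If it was by hand, the simplest way to rule out a settings mismatch is to let PTS drive the run itself; a minimal sketch, assuming the standard unigine-heaven test profile:

    Code:
    phoronix-test-suite benchmark unigine-heaven

    That installs the test with the profile's fixed settings and only prompts for resolution, so the numbers should be directly comparable to the article's.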



  • ic3man5
    replied
For some reason I'm not getting the same results as the AMD RX 5700 XT benchmarks are showing. I'm assuming I did something wrong?

    Copy/Paste from the Generated HTML file:

    Unigine Heaven Benchmark 4.0

    FPS: 59.5
    Score: 1498
    Min FPS: 18.9
    Max FPS: 120.4
    System

    Platform: Linux 4.18.0-25-generic x86_64
    CPU model: Intel(R) Core(TM) i7-4770K CPU @ 3.50GHz (3499MHz) x8
    GPU model: Unknown GPU (256MB) x1
    Settings

    Render: OpenGL
    Mode: 2560x1440 8xAA fullscreen
    Preset: Custom
    Quality: Ultra
    Tessellation: Extreme

    I was watching
    Code:
    cat /sys/kernel/debug/dri/0/amdgpu_pm_info
    and GPU utilization was at 99% and the clock speeds seemed to be correct.
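
    For anyone wanting to watch it live instead of cat-ing it repeatedly, something like this should work, assuming debugfs is mounted and you have root:

    Code:
    # refresh the power/clock readout once per second; debugfs is root-only
    sudo watch -n 1 cat /sys/kernel/debug/dri/0/amdgpu_pm_info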


    I am using Ubuntu 18.04 with the latest amdgpu driver from AMD's website.
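
    Side note: the "Unknown GPU (256MB)" line above makes me wonder whether Heaven is even identifying the card correctly. To double-check which renderer is actually in use, something like this should tell (assuming mesa-utils is installed):

    Code:
    glxinfo | grep "OpenGL renderer"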



  • Danny3
    replied
    Originally posted by smitty3268 View Post

    Apparently OpenCL support is pretty busted in their Windows drivers right now (for Navi). I can only assume the same is true on linux.
    What?
    If it's true, this is very unfortunate.



  • skeevy420
    replied
    Originally posted by ernstp View Post

    Xorg amdgpu DDX of course (not modesetting)? TearFree on or off?
    Xorg and On.



  • ernstp
    replied
    Originally posted by skeevy420 View Post
    RADV also has vsync induced mouse stutters, though it isn't as bad (but still very noticeable). Just like with AMDVLK, disabling vsync instantly fixes it. RADV having the same vsync input issues makes me wonder if it's the actual AMDGPU kernel driver.
    Xorg amdgpu DDX of course (not modesetting)? TearFree on or off?



  • skeevy420
    replied
    Originally posted by bridgman View Post

    Can you suggest a good game to demonstrate this with?
    Hitman 2 with Proton 4.2-9.

    I'm using an up-to-date Manjaro with Mesa 19.1.1, Linux 5.1.16, a 4GB RX 580, 2x Westmere x5687, and 48GB DDR3.
    My in-game settings are DX11, Fullscreen, 1920x1080, everything set to High or On except Motion Blur (which is off because it gives me a queasy feeling).
    The free level of Hitman 2 on the Steam store is able to trigger all of these.

    Ultra Shadow Settings Vsync Bug

    Start any mission, give the shader cache a minute or two to build up, and then change the Shadows from High to Ultra. That'll trigger the weird desyncing. Changing the Shadows back to High keeps it buggy; changing Shadows to Medium or Low fixes it and allows High to be selected again. Using Ultra only works when selected from the game's launcher. Disabling vsync prevents the shadow issue from happening. This does not affect RADV, only AMDVLK and Pro... at least all of the 19.XX Pro releases and every AMDVLK from Manjaro or source since May-ish (around when I bought this game).

    Input Vsync Bug

    With AMDVLK/Pro and vsync enabled it's pretty much impossible to use a mouse with this game. I'd never noticed how bad mouse look was since I've always played it with a PS4 controller. I used KB/M earlier and it was just awful. Disabled vsync and it was instantly perfect.

    RADV also has vsync induced mouse stutters, though it isn't as bad (but still very noticeable). Just like with AMDVLK, disabling vsync instantly fixes it. RADV having the same vsync input issues makes me wonder if it's the actual AMDGPU kernel driver.

    With a PS4 controller the vsync bug will cause it to act like you're holding the stick left or right until you jiggle it around. It's annoying, but the game is a lot more playable than with a KB/M, and it made me unsure if I was dealing with Wine, XInput, or what. Disabling vsync fixes all the input issues.

    I haven't tried the 30fps option in the game or via framerate capping with libstrangle. Only 60fps and vsync on/off.
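
    If anyone wants to try the capped path, my understanding is libstrangle's wrapper form below is all it takes; a sketch assuming strangle is on PATH (the %command% form is for Steam launch options):

    Code:
    # cap the game at 30 FPS via libstrangle
    strangle 30 %command%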

    EDIT: Built the 5.2 kernel and I can't trigger either of those bugs with AMDVLK/Pro or RADV. The kernel is the only thing I changed. Still affected with kernel 5.1.17.
    Last edited by skeevy420; 11 July 2019, 01:12 PM.



  • tildearrow
    replied
    Originally posted by coder View Post
    I didn't say it's not supported, just that it's not well-supported. I should've said "widely-supported", as many consumer devices do not support it.
    Exactly. I am fine with a software decoder, however. After all I don't distribute my screen recordings.

    Originally posted by coder View Post
    This argument actually undermines your case. It sounds like you're blindly saying that "professional stuff is better - and I like better - therefore, I want it".

    The point of it being a professional format isn't because it's simply better, but rather that professionals have use cases that involve specific post-processing operations that would be hampered by low-bandwidth chroma.
    Exactly, although in consumer equipment companies are allowed to make sacrifices that won't hurt Average Joe. Examples follow:
    - Onboard sound cards on motherboards often leak noise, although a very small amount that is tolerable for Average Joe. I can hear it during silence though...
    - Displays (and especially laptop displays) are often uncalibrated, therefore having poor color reproduction, but Average Joe doesn't really care about this.
    - Consumer motherboards/graphics cards don't have ECC, because manufacturers think it's OK for those computers to fail 1-5 times per year and users will just tolerate it. Yeah, they know Average Joe will be enraged, but they know he'll calm down after the storm.
    - Games that use NVIDIA RTX don't use it for the whole scene, but rather for some effects only.
    - Those "384 kHz 32-bit" sound cards actually have a 21~22-bit SNR, plus Average Joe isn't aware he can only hear up to 20 kHz (which would translate to a sampling rate of 40 kHz). He only thinks "bigger numbers are better".

    Originally posted by coder View Post
    How was this produced? I am suspicious of the chroma decimation used. Further, there are clear DCT-style compression artifacts that suggest degradation possibly caused by quantization - not just band-limiting and interpolation.
    It was produced with an external capture card I used for a time in 2014-2015, just to demonstrate how bad it can get. x264 and the hardware encoders perform better, but the desaturation is still there.

    Originally posted by coder View Post
    I'll grant you that simply using 4:4:4 is an easy way to eliminate the entire issue of the low-pass filter and interpolation quality, although you're still susceptible to stronger quantization of the chroma channels.
    Note that I do not use constant bitrate mode (CBR), but rather the secret constant quality mode (CQP/CRF). I can accept a larger file size for better quality.
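
    As a concrete sketch of what I mean, assuming ffmpeg built with libx264 (file names are placeholders, and 4:4:4 output uses the High 4:4:4 profile, which many hardware decoders won't play):

    Code:
    # constant-quality encode that keeps full-resolution chroma (4:4:4)
    ffmpeg -i capture.mkv -c:v libx264 -crf 16 -preset slow -pix_fmt yuv444p out.mkv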

    Originally posted by coder View Post
    I should add that I often use (4:2:0-sampled) JPEGs for screen grabs, since they frequently compress better than PNG, even with no observable loss in quality. I don't believe I've seen the sort of artifacts in your example, though I'm not usually looking for them.
    I see. In my case I always use PNG, because I really can't tolerate that compression noise around text in my screenshots.



  • bridgman
    replied
    Originally posted by smitty3268 View Post
    Perhaps I should clarify. What I meant was that no one at AMD cares about somebody whining about it on the Phoronix forums. The feature will either come or not based on whether AMD thinks it makes business sense, and I'm certain it's something that is at least on their radar. But 1 person repeatedly posting the same complaint here on every AMD thread isn't going to do anything one way or the other.
    Yeah, I guess the distinction is between "commenting" and "repeatedly posting". We don't increase the priority of something based on the number of times one person posts about it, but we do look for patterns and pain points then try to feed that into R&D plans. Driver SW is mostly Linux-specific and so it's easier to get Linux issues considered there, within the constraints of the Linux team's size. HW changes are generally more difficult to get in if they are Linux-only, although that is gradually changing over time.

    Originally posted by skeevy420 View Post
    At least one AMDVLK bug in Hitman 2 has been fixed after I posted here about it. It might be a coincidence, might not be. It was a flashbang slowdown and engine de-syncing issue I was experiencing. After I posted about it, it was fixed two or three weeks later in the next AMDVLK release.
    Originally posted by skeevy420 View Post
    My only real major bug with AMDVLK now is that if I change the shadow quality to whatever the shader cache wasn't generated with, the game just gets weird with de-sync issues: movement lag, animations and actions don't trigger at the right time, and running around during that reminds me of being drunk in GTA V. Changing the shadow settings back to whatever they were before, or restarting the game with the new shadow settings (which triggers a cache build), fixes it.
    Can you suggest a good game to demonstrate this with?



  • bridgman
    replied
    Originally posted by skeevy420 View Post
    While that 19.30 release says it's for Navi only, I can confirm that the Pro Vulkan from it works with my 580. I've only run it with the official Proton 4.2 release and Hitman 2, but it was one of the smoothest Hitman 2 runs I've ever had (once the shader cache generated)... I enabled VK_ICD_FILENAMES=/opt/amdgpu-pro/etc/vulkan/icd.d/amd_icd64.json globally.
    For what it's worth, I don't believe anything was done that we know would cause breakage on any other HW... the "Navi only" statement reflects the fact that QA was very Navi-focused for this release.
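
    For anyone else wanting to test it per-game rather than globally, pointing the Vulkan loader at the Pro ICD in the game's Steam launch options should also work; a sketch using the same path from skeevy420's post:

    Code:
    # select the AMDGPU-PRO Vulkan ICD for a single Steam title
    VK_ICD_FILENAMES=/opt/amdgpu-pro/etc/vulkan/icd.d/amd_icd64.json %command%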



  • skeevy420
    replied
    Originally posted by smitty3268 View Post

    Perhaps I should clarify. What I meant was that no one at AMD cares about somebody whining about it on the Phoronix forums. The feature will either come or not based on whether AMD thinks it makes business sense, and I'm certain it's something that is at least on their radar. But 1 person repeatedly posting the same complaint here on every AMD thread isn't going to do anything one way or the other.
    At least one AMDVLK bug in Hitman 2 has been fixed after I posted here about it. It might be a coincidence, might not be. It was a flashbang slowdown and engine de-syncing issue I was experiencing. After I posted about it, it was fixed two or three weeks later in the next AMDVLK release.

    My only real major bug with AMDVLK now is that if I change the shadow quality to whatever the shader cache wasn't generated with, the game just gets weird with de-sync issues: movement lag, animations and actions don't trigger at the right time, and running around during that reminds me of being drunk in GTA V. Changing the shadow settings back to whatever they were before, or restarting the game with the new shadow settings (which triggers a cache build), fixes it.

