
AMD Radeon Linux Gaming Performance On Mesa 20.1 Looking Good With RADV+ACO


  • CochainComplex
    replied
    Found a possible explanation for why the 5700 XT seems to show odd behaviour.

    This patch reveals some interesting info: https://gitlab.freedesktop.org/mesa/...65ef7f0c29fc44
    Now that ACO supports all shader stages (the only exception is NGG GS on Navi10, but it falls back to legacy GS), it makes sense to remove the LLVM version string reported as part of the device name. The LLVM version string was added in the past for some Feral games to work around LLVM issues by detecting the version. With ACO, this is unnecessary because the Mesa version is enough to eventually enable specific shader workarounds. When the LLVM version string is missing, it is assumed that an old LLVM is used and workarounds are automatically applied. The only Vulkan game that might be affected is Shadow of the Tomb Raider, but the impact should be fairly small.
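    A minimal sketch of the detection scheme the patch describes: a game parses the reported device name for an LLVM version string, and when the string is absent it assumes an old LLVM and applies workarounds. The name format and the `(9, 0, 0)` threshold are assumptions for illustration, not taken from the patch.

```python
import re

def llvm_version(device_name):
    """Extract the LLVM version a RADV device name used to advertise,
    e.g. "AMD RADV NAVI10 (LLVM 10.0.0)". Returns None if absent."""
    m = re.search(r"\(LLVM (\d+)\.(\d+)\.(\d+)\)", device_name)
    return tuple(map(int, m.groups())) if m else None

def needs_shader_workaround(device_name):
    """Mirror the fallback described above: a missing LLVM version string
    is treated as an old LLVM, so workarounds are applied by default.
    The (9, 0, 0) cutoff is a hypothetical example threshold."""
    ver = llvm_version(device_name)
    return ver is None or ver < (9, 0, 0)

print(needs_shader_workaround("AMD RADV NAVI10 (LLVM 10.0.0)"))  # False
print(needs_shader_workaround("AMD RADV NAVI10"))                # True
```

    The new ACO-era name without the LLVM suffix trips the conservative default, which is why the impact of dropping the string should be small.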



  • leipero
    replied
    Originally posted by Veto View Post

    No it wasn't. You just remember it that way, because that was the state of the art back then. If in doubt, try playing some old games again with the resolution and frame rate lowered to match.

    However, CRTs had some benefit due to hiding the low frame rates in flicker, whereas LCDs can appear to have more judder due to the lack of flicker. Also, some LCDs have ghosting. And way back, the 2D games were tightly designed and synced to the vertical refresh, so they appeared more fluid.

    Edit: BTW, the flicker effect is also used in cinema, where they hide the insanely low frame rate (~24 fps) by displaying each image twice. Still, when I go to a theater, the amount of judder in movies, e.g. when panning, drives me crazy.
    Actually, you are wrong and he is right, sort of at least; let me explain.

    I don't know what the cause is here, but on some PCs games appear smoother and on others less smooth, even with the same GPU and display device, given the same FPS, latency and everything. So it's perfectly possible that 240 FPS doesn't appear smooth on some PCs.

    My best guess as to why this happens in the first place is some sort of connectivity issue (on the motherboard), or something related to storage devices (SATA controllers, if used, etc.) and I/O operations, and I have to test that further to know for sure. What led me to that conclusion? Aside from historical cases where I saw such things in person (same software, same CPU, same GPU etc., different motherboard), I've noticed the difference between a VM and bare hardware as well: the VM would appear smoother (with a passed-through GPU) when certain VHD options aren't used. I haven't bothered to actually test it yet, though now I'm interested in doing so; my best guess is something along the lines of the "aio" option. Hence I suspect that, for some reason, while I/O operations are not getting starved, something seriously wrong is being done to them by the motherboard UEFI/BIOS, or even the board's schematic design itself, or a controller. This is me speculating, of course; it could be something else.

    So again, I don't know why it happens, but it does, and I'm not sure how those products actually pass quality control, because to me such production is unacceptable. If you claim compatibility, or even basic functionality, you have to meet it 100%; you don't want a wheel on your car that is 99% functional. Of course, in the car's case that can cost you your life, but regardless, I would chalk this up to poor quality control, and if we're being real here, such products are sold at a lower price even though the manufacturer is well aware of the issues the product has.
    If that is the case (and we "know", or are at least 99% sure, it is), this issue is unsolvable, simply because under the current (economic) system manufacturers "cut corners" in order to be competitive. They end up with a bad design, and such products go to the "lower end"; since the manufacturers invested some amount of money in that particular design, they don't want to lose it, so they are perfectly willing to sell non-functional products at a lower price in order to meet their margins. Regulating this by law is next to impossible, for multiple reasons, while on the surface the solution is relatively simple: manufacturers must follow the chip makers' design approach down to the "last screw". Obviously, that removes the "diversity" between products, the added "features" and all sorts of nonsense you can see across product lines. Regulation by law would also carry the risk of companies that do such things going out of business: 1 bad design = not passing quality control; 2, 3, 4 = bankruptcy. And that's setting aside the potential for corruption, with companies paying the "right people" to pass quality control with a bad product.

    So yeah, this issue is next to impossible to solve in any way as of now, and aside from all the negative effects it has on consumers, it actually harms the environment as well, because most of these products end up in the trash much sooner than they need to, contributing to the e-waste issue; in the bigger picture, materials are used for their production when they could have been used in a much more sensible way. I'm ranting at this point, sorry lol.



  • Linuxxx
    replied
    Originally posted by Dr. Righteous View Post
    Call me late to the party, but I just recently took the 100% plunge into Linux gaming. I maintained a dual-boot PC and mainly gamed on Win7. The last time I went to update my video card drivers it ran into some hiccups because I used Win7. That was it; I was done.
    The latest LTS for Ubuntu just hit, and as is my habit I installed Ubuntu Studio on the PC, making Win7 a thing of the past. So I installed Steam and Steam Play, and to my great delight many of the games I currently play from time to time ran great. Just as well as they did on Windows. But still, some not so much.
    One thing that occurred to me was that Ubuntu Studio uses a low-latency kernel. I'm wondering if that will affect how these games run.
    The low-latency Linux kernel choice is the only sensible one for general desktop usage, and even more so for gaming purposes!

    Here's a benchmark that consistently showed better minimum frame-rates when configuring Linux to utilize 1000 Hz timer ticks, which Ubuntu's low-latency kernel does:

    The Linux kernel's CONFIG_HZ option can modify the balance between system throughput and latency. In this article, we explore its effects on KVM.


    And here's the relevant quote:
    Overall, 1000Hz nets better minimum framerates.
    ...
    It is advisable to pick 1000Hz over 100Hz or 250Hz in order to receive a small but tangible minimum framerate improvement while gaming.
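    A quick back-of-the-envelope illustration (my own arithmetic, not from the article) of why the tick rate matters for frame pacing: a tick-driven timeout can fire up to one full tick late, and at 100 Hz that worst case is more than half of a 60 fps frame budget.

```python
# Worst-case extra latency from timer-tick granularity for common
# CONFIG_HZ values, compared against a 60 fps frame budget.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

for hz in (100, 250, 1000):
    tick_ms = 1000 / hz                      # one scheduler tick in ms
    share = tick_ms / FRAME_BUDGET_MS * 100  # tick as % of a frame
    print(f"CONFIG_HZ={hz:>4}: tick every {tick_ms:4.1f} ms "
          f"(~{share:.0f}% of a 60 fps frame)")
```

    This prints 10.0 ms (~60% of a frame) for 100 Hz, 4.0 ms (~24%) for 250 Hz, and 1.0 ms (~6%) for 1000 Hz, which matches the quoted advice that 1000 Hz helps minimum framerates.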



  • Dr. Righteous
    replied
    Call me late to the party, but I just recently took the 100% plunge into Linux gaming. I maintained a dual-boot PC and mainly gamed on Win7. The last time I went to update my video card drivers it ran into some hiccups because I used Win7. That was it; I was done.
    The latest LTS for Ubuntu just hit, and as is my habit I installed Ubuntu Studio on the PC, making Win7 a thing of the past. So I installed Steam and Steam Play, and to my great delight many of the games I currently play from time to time ran great. Just as well as they did on Windows. But still, some not so much.
    One thing that occurred to me was that Ubuntu Studio uses a low-latency kernel. I'm wondering if that will affect how these games run.



  • CochainComplex
    replied
    Originally posted by nranger View Post

    Indeed, humans are incredibly analog. Our perceptive abilities vary greatly depending on movement, brightness, contrast, kinesthetics, etc.

    As you mention, input lag is a huge deal. Another one is pixel persistence. Too many monitors today advertise 144, 165, or 240Hz, but they have pixels that can only switch certain colors or brightness levels at the equivalent of 100 or 120Hz. LCDs are far superior to CRTs for sharpness, resolution, and contrast, but only in the last few years have they matched CRT technology when it comes to motion clarity.
    I remember in the early 2000s my father got me an "old" decommissioned SGI CAD 21" CRT from his department at work. 1280x1024 at 80 Hz... Counter-Strike at around 80 FPS... that was smooth on that thing.



  • CochainComplex
    replied
    Originally posted by TemplarGR View Post

    If 240 fps isn't smooth enough for you, you don't have lizard eyes, you have robotic eyes....

    People don't notice the difference much at such high framerates. What people mean when they talk about smoothness is input lag. Low framerates make input lag more noticeable, since it takes longer for the result of your input to show up on the monitor, while a high framerate means that any micro-move you make will appear on the screen almost instantly.

    So it is not really about the eyes, it is about the hands.
    You are not one of us...only Lizard People understand. Ask Hillary Clinton she will know what I'm talking about.

    Fun aside: 144 Hz is nice, but as you already wrote, I doubt I'm able to recognize any change above that. Only input lag, as you wrote.



  • theriddick
    replied
    A lot of console gamers are VERY used to playing at 30 fps. There are tricks MS/Sony use to make games seem decent at 30 fps/Hz that generally aren't used for PC games.

    I can't wait for 4K@100-144 Hz monitors to become the normal thing. You can already get a few atm, but the price can be pretty damn steep (and we will need a 3080 Ti to even capitalize on them).



  • geearf
    replied
    Thanks Valve for this!



  • pal666
    replied
    Originally posted by Veto View Post
    However, CRTs had some benefit due to hiding the low frame rates in flicker, whereas LCDs can appear to have more judder due to the lack of flicker. Also, some LCDs have ghosting. And way back, the 2D games were tightly designed and synced to the vertical refresh, so they appeared more fluid.
    in addition to all of this, lcds usually spend considerable time processing the data, while a plain old crt was shooting pixels immediately



  • pal666
    replied
    Originally posted by Raka555 View Post
    What is interesting to me is that "back in the days" we played with 25-30 fps and it was buttery smooth.
    Not sure how it was achieved
    it was achieved by comparing your experience against 5-10 fps

