DXVK 1.9.3 Released With NVIDIA DLSS Integration, Many Game Fixes

  • #11
    Originally posted by xPakrikx View Post
    I'm a little bit lost right now, so is it dxvk-native?
    DXVK is a translation layer that turns DirectX 9/10/11 calls into Vulkan at run-time. It is mostly useful for running DirectX 11 apps under Wine.
    DXVK-Native is the same thing as DXVK, but applied at compile-time. It is mostly useful for porting your DirectX app to Vulkan.
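A minimal sketch of the run-time case described above: dropping DXVK into an existing Wine prefix and confirming it is actually in use. The prefix path and game executable name are placeholder examples, not from the thread; the `setup_dxvk.sh` script ships in the DXVK release tarballs of this era, and `DXVK_HUD` is DXVK's own overlay variable.

```shell
# Placeholder prefix; point this at whatever prefix the game lives in.
export WINEPREFIX="$HOME/.wine-hatintime"

# The release tarball's setup script copies the DXVK DLLs into the prefix
# and registers the native DLL overrides (d3d9/d3d10core/d3d11/dxgi):
./dxvk-1.9.3/setup_dxvk.sh install

# Enable DXVK's HUD overlay to verify the game is really going through DXVK
# rather than wined3d ("HatinTime.exe" is a hypothetical binary name):
DXVK_HUD=devinfo,fps wine HatinTime.exe
```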



    • #12
      Originally posted by tildearrow View Post

      It links to DXVK upstream. What happened?
      It didn't this morning.



      • #13
        ssokolow
        Senior Member
        From my own testing, DXVK has about 10-28% overhead compared to Gallium Nine in some cases, but it really depends on the shader model used (I'll explain). A quick check on Steam suggests the game actually uses DX9, so if you are not stuck with NVIDIA, I would try Gallium Nine.

        Fsync/esync have almost no impact on actual performance and overhead from what I saw; fsync (with the FUTEX2 futex_waitv work) seems to improve overall "smoothness", but in general it has little to no impact on actual performance numbers.
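For reference, this is roughly how esync/fsync are toggled, assuming a Wine build that carries those patches (e.g. wine-tkg or Proton-style builds; plain upstream Wine of this era ignores the variables). The game binary name is a placeholder.

```shell
# esync: eventfd-based synchronization primitives
WINEESYNC=1 wine game.exe

# fsync: futex-based primitives; needs a kernel with the
# futex_waitv/FUTEX2 support the post mentions
WINEFSYNC=1 wine game.exe
```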

        To elaborate on shader models: while testing, I observed that general texture fillrate is pretty much on par between DXVK and Gallium Nine. The big difference shows up in the vertex and pixel shader tests, where Gallium Nine is quite a bit faster, and to make things even more interesting, it really depends on what type of shaders are used. For example, 3DMark03, which has both SM2.0 and SM3.0 tests, generally loses 10-28%, while 3DMark06 actually gains performance with DXVK (about 14%) versus Gallium Nine.
        It also depends on the scenario: in 3DMark03, the 11% difference in favor of Gallium Nine comes at 1080p, while the 28% gap shows up at 800x600.

        Some games do have issues with Gallium Nine and either lose a lot of performance or can't run at all, so DXVK is better in terms of compatibility.



        • #14
          leipero
          Senior Member

          From my own experience, Gallium-Nine is not worth it because of its buggy nature, which makes it unreliable.

          Especially so for Unreal Engine 3 based games, which "A Hat in Time" is.

          Also note that the performance governor helps DXVK immensely; I'm not sure whether you used it during your benchmarks.
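The governor switch mentioned here can be done through the cpufreq sysfs interface; this is a sketch that needs root, and the available governor names depend on which cpufreq driver the system uses.

```shell
# Pin all cores to the "performance" governor for the benchmark run:
echo performance | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor

# ... run the game or benchmark ...

# Restore the default (schedutil on most modern kernels):
echo schedutil | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
```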

          ssokolow
          Senior Member

          Have you ever thought about getting a second-hand workstation-class PC with an Intel Haswell CPU?

          They are dirt cheap and would actually multiply your CPU performance compared to your rather ancient Athlon II...



          • #15
            Linuxxx
            Senior Member
            Yes, Gallium Nine is far less compatible than DXVK; it is still worth a shot, though. I don't know how it behaves in UE3-engine games, to be fair.

            Maybe in some cases; I personally didn't see any significant difference between the performance governor and schedutil, so I don't really bother with governors. Schedutil is just fine in my experience.
            You can actually replicate my findings easily: install 3DMark03, deselect the CPU tests (this saves time and makes no difference; if anything, it gives an advantage to DXVK), leave the rest at default, change the resolution (lower makes the overhead easier to see), and run the benchmark with Gallium Nine. Then do the same with DXVK.
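The A/B run described above might look roughly like this. The exact way Gallium Nine is enabled depends on how it was installed (distro package vs. gallium-nine-standalone), so treat these commands as placeholders rather than a recipe.

```shell
# Run 1: Gallium Nine as the d3d9 provider.
# With gallium-nine-standalone, ninewinecfg toggles it inside the prefix:
wine ninewinecfg.exe          # tick "Enable Gallium Nine" in the dialog
wine 3DMark03.exe             # note the score at your chosen resolution

# Run 2: DXVK's d3d9.dll instead (DXVK DLLs installed in the prefix;
# the native override makes Wine prefer them over its builtin wined3d):
WINEDLLOVERRIDES="d3d9=n" wine 3DMark03.exe   # compare the score
```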



            • #16
              Originally posted by leipero View Post
              From my own testing, DXVK has about 10-28% overhead compared to Gallium Nine in some cases
              Sounds about right. I've got a 3.2GHz dual-core chip and the minimum system requirements list a 3.0GHz dual-core chip.

              Originally posted by leipero View Post
              A quick check on Steam suggests the game actually uses DX9, so if you are not stuck with NVIDIA, I would try Gallium Nine.
              I'm on a GeForce GTX750 that I've had since the fglrx days... which reminds me, I need to re-research good low-profile angled SATA cables to have on standby so I can fit in the GTX760 my brother upgraded off of if my GTX750 dies before I'm ready to upgrade.

              Originally posted by Linuxxx View Post
              Have you ever thought about getting a second-hand workstation-class PC with an Intel Haswell CPU?

              They are dirt cheap and would actually multiply your CPU performance compared to your rather ancient Athlon II...
              1. I'm guessing they run at higher than 65W TDP? I don't have air conditioning and need something that can run cool.
              2. Given that this has to run 24/7 maybe six feet from my pillow, I'd probably have to transplant it into my current case and buy it a Noctua CPU cooler, and I'm not sure I can justify that over just saving up for a new Zen 3 (or 4, if I can wait) machine once the chip shortage ends.
              3. Aside from A Hat In Time and occasional grumbling about YouTube being too much of a hog when multiply middle-clicked, I'm pretty comfortable with what I have. I'm not a huge gamer and normally stick to stuff old enough or indie enough to be perfectly fine with my current system.
              4. I live in middle-of-nowhere, Ontario, Canada, and it'd be time-consuming and/or expensive to drive to somewhere with good enterprise-class electronics recyclers (e.g. Markham, so I hear) or pay to have something so bulky shipped.
              I suppose points 1 and 2 could be remedied by using it only for A Hat In Time and keeping it turned off when not in use, but that'd be a big hassle: fan noises really bother me, and I don't really feel like spending $50+ on a pair of those construction-site hearing protectors with built-in headphones and an aux jack, just so I can be annoyed by having my head squeezed instead. I dump and emulate all my non-PC games partly for the convenience of having everything right here.
              ssokolow
              Senior Member
              Last edited by ssokolow; 12 January 2022, 08:31 PM.



              • #17
                Originally posted by leipero View Post
                From my own testing, DXVK has about 10-28% overhead compared to Gallium Nine in some cases, but it really depends on the shader model used (I'll explain).
                They use different shader compilers at the moment; you'll have to recheck after RadeonSI is ported to ACO.



                • #18
                  Originally posted by pal666 View Post
                  They use different shader compilers at the moment; you'll have to recheck after RadeonSI is ported to ACO.
                  I will. Are there any plans to do that? Is it a WIP?

