DIRT 5 Now Runs On Intel Arc Graphics Under Linux With Driver Workaround

  • #11
    Codemasters just needed FP64 to make the blatant rubberbanding AI even more infuriating.


    • #12
      It seems some people don't understand how games work. Different games need different levels of precision in different parts of the game: a render engine needs a different level of precision than the physics engine.

      When you create a world, you have to specify base units. In Minecraft, for instance, a block is the terrain base unit, so computing distances for position data only takes simple math.

      Now take a game like Space Engineers, where you can go from digging ores out of a destructible planet, Minecraft-style, to flying into outer space and mining distant asteroids, moons, and other planets. How big a numeric range do you think the physics needs to cover in that case?

      The terrain engine alone needs high-precision math. Then think about light rays, the thing the GPU does, right? It needs to gather bounced light from other objects, pack multiple pieces of data into a common format as an optimization, run high-performance tessellation effects, and on top of all that handle HDR lighting, plus the math to figure out where to fade in and out that mountain that is 5 km away but 12 km long.

      To represent distances at the scale of the terrain, you need high-precision numbers. You can write routines that do it in lower precision, but you end up doing far more work than a single calculation: you have to divide the world into chunks, run a batching routine to select the next batch while deciding which parts are in line of sight, then process those batches and hope the player doesn't move too fast.
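A quick back-of-the-envelope illustration of that precision range (plain Python emulating IEEE-754 float32 via `struct`; a sketch for this discussion, not any engine's actual code):

```python
import struct

def to_f32(x: float) -> float:
    # Round a Python float (which is f64) to the nearest IEEE-754 float32.
    return struct.unpack("f", struct.pack("f", x))[0]

def f32_ulp(x: float) -> float:
    # Spacing between adjacent float32 values near x, i.e. the smallest
    # position step a 32-bit coordinate can still represent there.
    bits = struct.unpack("I", struct.pack("f", to_f32(x)))[0]
    return struct.unpack("f", struct.pack("I", bits + 1))[0] - to_f32(x)

for meters in (100.0, 10_000.0, 10_000_000.0):
    print(f"{meters:>12,.0f} m from origin -> smallest step ~{f32_ulp(meters):.6f} m")
```

At 100 m from the origin a float32 coordinate resolves micrometres; at 10 km it's down to about a millimetre; at 10,000 km the smallest representable step is a full metre, so sub-metre positions simply cease to exist.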


      • #13
        Originally posted by dragorth View Post
        It seems some people don't understand how games work. Different games need different levels of precision in different parts of the game: a render engine needs a different level of precision than the physics engine.

        When you create a world, you have to specify base units. In Minecraft, for instance, a block is the terrain base unit, so computing distances for position data only takes simple math.

        Now take a game like Space Engineers, where you can go from digging ores out of a destructible planet, Minecraft-style, to flying into outer space and mining distant asteroids, moons, and other planets. How big a numeric range do you think the physics needs to cover in that case?

        The terrain engine alone needs high-precision math. Then think about light rays, the thing the GPU does, right? It needs to gather bounced light from other objects, pack multiple pieces of data into a common format as an optimization, run high-performance tessellation effects, and on top of all that handle HDR lighting, plus the math to figure out where to fade in and out that mountain that is 5 km away but 12 km long.

        To represent distances at the scale of the terrain, you need high-precision numbers. You can write routines that do it in lower precision, but you end up doing far more work than a single calculation: you have to divide the world into chunks, run a batching routine to select the next batch while deciding which parts are in line of sight, then process those batches and hope the player doesn't move too fast.
        Is it more work? Yes. Is it something game devs should do? Almost always. It's called optimization, something that used to be very common in the games industry but has fallen out of vogue in recent years. It has the really neat side effect of increasing performance, which leads to things like higher FPS, better frame pacing, and more.

        It was a great time when game developers would bother to do things like this. Game developers had a better reputation, and we didn't need trash like DLSS to play games that looked stunning on a moderately powerful GPU.
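One classic flavor of that optimization is a "floating origin": keep each object's position in f64 world space, but rebase everything to the camera before handing f32 coordinates to the renderer. A minimal sketch in plain Python (an illustration of the general technique, not Space Engineers' or any other engine's actual code):

```python
import struct

def to_f32(x: float) -> float:
    # Round a Python float (f64) to the nearest IEEE-754 float32,
    # mimicking what a GPU vertex buffer would store.
    return struct.unpack("f", struct.pack("f", x))[0]

def render_coords(world_positions, camera):
    # Subtract in f64 first, THEN truncate to f32: the small relative
    # offsets survive the conversion even when the absolute world
    # coordinates would not.
    return [tuple(to_f32(p - c) for p, c in zip(pos, camera))
            for pos in world_positions]

camera = (10_000_000.0, 0.0, 0.0)    # camera 10,000 km from the world origin
rock   = (10_000_000.25, 1.5, -3.0)  # a rock 25 cm in front of it
print(render_coords([rock], camera)) # offsets stay exact: (0.25, 1.5, -3.0)
```

The renderer only ever sees small camera-relative numbers, so f32 precision holds no matter how far the player wanders; the cost is exactly the extra bookkeeping described above.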


        • #14
          Originally posted by Quackdoc View Post
          It was a great time when game developers would bother to do things like this. Game developers had a better reputation, and we didn't need trash like DLSS to play games that looked stunning on a moderately powerful GPU.
          DLSS is actually awesome because you can use it as a faster super-sampling method. That's actually how Nvidia initially presented it; they later repositioned it as primarily an upscaler (but kept the name: "Deep Learning Super Sampling"). You can still use it for super sampling, well, on Windows. For example, on a 1080p display you can use DSR to run the game at 4K, but use DLSS to render at 1080p and reconstruct to 4K, which DSR then downsamples back to 1080p. It looks great and comes really close to what normal super sampling looks like.

          The tech itself is great, but the way it's used now is indeed worrying. It should be an option, not a requirement for playable frame rates. The good uses of DLSS are to super sample, or to upscale to really high resolutions at high frame rates (like 4K 144FPS). But Nvidia realized they can use it to sell underpowered cards (see the RTX 4060, for example) at a higher price, because you can use DLSS to work around the shitty performance of the card.

          Great tech, shitty company using it wrongly.
          Last edited by RealNC; 24 March 2024, 06:54 AM.
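The resolution arithmetic in that DSR + DLSS combo works out to zero net shading cost. A sketch of the numbers (the 0.5-per-axis render scale is an assumption matching DLSS's commonly cited "Performance" preset, not something stated in this thread):

```python
# 1080p display -> DSR 4x -> DLSS reconstruction -> downsample to 1080p.
display = (1920, 1080)

# DSR 4.00x doubles the render target per axis, i.e. 4x the pixels.
dsr_target = (display[0] * 2, display[1] * 2)          # (3840, 2160)

# DLSS renders internally at a fraction of that target and reconstructs
# back up to it (0.5 per axis assumed here).
dlss_render_scale = 0.5
internal = (int(dsr_target[0] * dlss_render_scale),
            int(dsr_target[1] * dlss_render_scale))    # (1920, 1080)

# Net effect: shading stays at native 1080p, while the image is
# reconstructed at 4K and then downsampled back to the 1080p display.
print(internal == display)  # True
```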


          • #15
            Originally posted by RealNC View Post

            DLSS is actually awesome because you can use it as a faster super-sampling method. That's actually how Nvidia initially presented it; they later repositioned it as primarily an upscaler (but kept the name: "Deep Learning Super Sampling"). You can still use it for super sampling, well, on Windows. For example, on a 1080p display you can use DSR to run the game at 4K, but use DLSS to render at 1080p and reconstruct to 4K, which DSR then downsamples back to 1080p. It looks great and comes really close to what normal super sampling looks like.

            The tech itself is great, but the way it's used now is indeed worrying. It should be an option, not a requirement for playable frame rates. The good uses of DLSS are to super sample, or to upscale to really high resolutions at high frame rates (like 4K 144FPS). But Nvidia realized they can use it to sell underpowered cards (see the RTX 4060, for example) at a higher price, because you can use DLSS to work around the shitty performance of the card.

            Great tech, shitty company using it wrongly.
            Oh, don't get me wrong, I like the upscaling technologies, though I'm not super impressed with DLSS lately. XeSS was good in the single game I tried it in. My initial use for FSR was super sampling as well. But yeah, game companies are indeed using it as a crutch, and that's a shame. I didn't mean to say that DLSS itself is trash, but rather that requiring it to get decent performance at native resolution is.


            • #16
              Originally posted by Quackdoc View Post

              Oh, don't get me wrong, I like the upscaling technologies, though I'm not super impressed with DLSS lately. XeSS was good in the single game I tried it in. My initial use for FSR was super sampling as well. But yeah, game companies are indeed using it as a crutch, and that's a shame. I didn't mean to say that DLSS itself is trash, but rather that requiring it to get decent performance at native resolution is.
              Agreed. And it's gonna get worse. The PS5 Pro has been confirmed (from what I can tell) to use a DLSS-like upscaler. Since games in general target consoles first and are ported to PC later, upscaling as a requirement rather than an option will probably become the norm.

              As for how DLSS looks: it's not that good on 1080p displays, but at 1440p and 4K it does look really nice. XeSS is also very promising. FSR right now is kind of meh, but apparently AMD is fixing it soon. My current use cases for upscaling (super sampling, or high frame rates for my 165Hz display) will unfortunately come to an end, though. No matter what tech you give game studios and publishers to improve performance, they will always end up targeting 30 or 60FPS while using said tech. The only exception might be frame generation: they can't target 30FPS with frame gen, since it looks like garbage if you don't reach 90FPS+ with it.

              And to say at least something on-topic: Dirt 5 is kind of crap. If you want a decent Dirt game, play Dirt 4, or Dirt 3 if you can stand all the "wow bro cool that's rad bro" talk, which I personally can't.


              • #17
                Originally posted by Quackdoc View Post

                Is it more work? Yes. Is it something the game devs should do? almost always. It's something called optimization.
                The problem with lack of precision has nothing to do with optimization, and it also cannot be worked around with more optimization.

                Starfield is another well-known example. The "open world" does not include entire planets, just some area around where the ship landed. This comes from the engine design decision to use a fixed reference point on the map as the coordinate basis, instead of the player. Coordinate math runs into more and more precision issues the further you get from that basis point.

                More precision allows for a larger world in such designs.
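The fixed-basis problem shows up in a few lines of plain Python (emulating float32 via `struct`; the 10,000 km figure is an arbitrary illustration, not Starfield's actual numbers):

```python
import struct

def to_f32(x: float) -> float:
    # Round a Python float (f64) to the nearest IEEE-754 float32.
    return struct.unpack("f", struct.pack("f", x))[0]

step = 0.05  # a 5 cm movement in one frame

near = to_f32(to_f32(100.0) + step)         # close to the coordinate basis
far  = to_f32(to_f32(10_000_000.0) + step)  # 10,000 km from the basis

print(near != 100.0)           # True: the step registers near the basis
print(far)                     # 10000000.0: far away, the step rounds away entirely

# Python's own floats are f64, where the same step survives just fine:
print(10_000_000.0 + 0.05 == 10_000_000.05)  # True
```

In f32 the far-away player is frozen in place: the 5 cm step is smaller than the gap between representable coordinates there. Doubling to f64 pushes that breakdown out by many orders of magnitude, which is exactly why more precision buys a larger usable world.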

                See a summary of the situation here:


                • #18
                  Originally posted by RealNC View Post

                  Agreed. And it's gonna get worse. The PS5 Pro has been confirmed (from what I can tell) to use a DLSS-like upscaler. Since games in general target consoles first and are ported to PC later, upscaling as a requirement rather than an option will probably become the norm.

                  As for how DLSS looks: it's not that good on 1080p displays, but at 1440p and 4K it does look really nice. XeSS is also very promising. FSR right now is kind of meh, but apparently AMD is fixing it soon. My current use cases for upscaling (super sampling, or high frame rates for my 165Hz display) will unfortunately come to an end, though. No matter what tech you give game studios and publishers to improve performance, they will always end up targeting 30 or 60FPS while using said tech. The only exception might be frame generation: they can't target 30FPS with frame gen, since it looks like garbage if you don't reach 90FPS+ with it.

                  And to say at least something on-topic: Dirt 5 is kind of crap. If you want a decent Dirt game, play Dirt 4, or Dirt 3 if you can stand all the "wow bro cool that's rad bro" talk, which I personally can't.
                  Remember when XeSS was going to be open source? Pepperidge Farm remembers xD. I personally occasionally go back and play Dirt 2 myself. I have a soft spot for it.

                  Originally posted by chithanh View Post
                  The problem with lack of precision has nothing to do with optimization, and it also cannot be worked around with more optimization.

                  Starfield is another well-known example. The "open world" does not include entire planets, just some area around where the ship landed. This comes from the engine design decision to use a fixed reference point on the map as the coordinate basis, instead of the player. Coordinate math runs into more and more precision issues the further you get from that basis point.

                  More precision allows for a larger world in such designs.

                  See a summary of the situation here:
                  https://www.rockpapershotgun.com/sta...afts-far-lands
                  Is Starfield supposed to be a good example of an fp64 game? I'm not sure I follow... Starfield still performs like complete trash, and considering how bland the vast majority of that game is, it's certainly not a good tradeoff IMO.


                  • #19
                    Originally posted by Quackdoc View Post

                    Remember when XeSS was going to be open source? Pepperidge Farm remembers xD. I personally occasionally go back and play Dirt 2 myself. I have a soft spot for it.



                    Is Starfield supposed to be a good example of an fp64 game? I'm not sure I follow... Starfield still performs like complete trash, and considering how bland the vast majority of that game is, it's certainly not a good tradeoff IMO.
                    You are confusing tech with game design there. Every Unreal Engine 4 and 5 game works this way by default, because that is how Unreal is set up. (Each game studio does get the engine source, so they may have rewritten this code; that's no insignificant piece of work, and the reality is most won't have.)
