AMD Catalyst Gaming Performance For BioShock Infinite On Linux


  • #11
    Originally posted by ChemicalBrother View Post
    I'm obviously not Michael, but this comparison is not possible, as Bioshock Infinite needs OpenGL 4.2 and Mesa only provides 3.3.
    Does it actually use any GL 4.2 or GLSL 4.20 functionality that's missing from Mesa, or can you make it run with version overrides?

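    If the only blocker is the advertised version rather than actual missing features, Mesa's version override environment variables are worth a shot. Below is a minimal launcher sketch in Python; the game binary path is just a placeholder, and the override only changes what Mesa reports, it doesn't add functionality the driver genuinely lacks.

    Code:
    import os
    import subprocess

    # Ask Mesa to advertise OpenGL 4.2 / GLSL 4.20 regardless of what the
    # driver would normally expose. Genuinely missing 4.2 features will still fail.
    env = dict(os.environ)
    env["MESA_GL_VERSION_OVERRIDE"] = "4.2"
    env["MESA_GLSL_VERSION_OVERRIDE"] = "420"

    # Placeholder path -- point this at the actual game binary.
    subprocess.run(["./BioShockInfinite"], env=env, check=False)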


    • #12
      Originally posted by AndyChow View Post
      CPU bottleneck on an i7-5960X? Crappy port is more like it.
      If it's not properly multithreaded, you can bottleneck even an i7 Extreme. The Witcher 2 is also extremely CPU-bottlenecked on my i7 @ 4.7 GHz: it doesn't matter whether I run it at Ultra @ 1440p or set everything to low, the framerate is about the same.



      • #13
        Originally posted by peppercats View Post
        A 2 GB 270X performing almost equal to a 4 GB 290 at 2560x1600 -> extremely obvious CPU bottleneck somewhere.

        ffs, release your Linux ports as betas first so they don't get such shitty reviews. How hard would it be to pay someone who works on Mesa to optimize your game? The people they're currently paying are obviously clueless.
        Hiring Mesa developers won't solve the problem, as the problem isn't actually with how they are using Mesa. They might very well be using Mesa very efficiently and still see that kind of performance setback. The main problem is taking a game that was designed to run on DirectX and making the exact same code run against OpenGL with no OpenGL-specific optimizations.

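        To make the wrapping argument concrete, here is a deliberately naive toy sketch of a 1:1 translation layer in Python. Every name in it is invented for illustration, and the actual wrapper used for this port is far more sophisticated, but the structural problem is the same: each DirectX-style call is replayed as the nearest OpenGL call, with no GL-specific batching or state caching.

        Code:
        class NaiveTranslationLayer:
            """Toy example: replay each D3D-style call as the nearest GL-style call."""

            def __init__(self, gl):
                # 'gl' stands in for some OpenGL binding; injected purely for illustration.
                self.gl = gl

            def set_constant_buffer(self, slot, data):
                # D3D updates a constant buffer; the naive mapping re-uploads the
                # whole uniform buffer every time, even if nothing changed.
                self.gl.buffer_sub_data(slot, 0, data)

            def draw_indexed(self, count):
                # One D3D draw call becomes one GL draw call -- no batching, and
                # none of the GL-specific tricks a native GL renderer would use.
                self.gl.draw_elements(count)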


        • #14
          Originally posted by sarmad View Post
          Hiring Mesa developers won't solve the problem, as the problem isn't actually with how they are using Mesa. They might very well be using Mesa very efficiently and still see that kind of performance setback. The main problem is taking a game that was designed to run on DirectX and making the exact same code run against OpenGL with no OpenGL-specific optimizations.
          They aren't using Mesa at all, because it doesn't run on Mesa.

          This entire situation reminds me of when console games got extremely bad Windows ports and their publishers then pointed to the bad sales as proof that PC gaming was dying, instead of realizing how shitty their ports were.



          • #15
            So, Michael, did you miss the ARB_texture_compression_rgtc bug on pre-GCN hardware (which causes characters to have no textures), or is there some even newer Ubuntu 15.04 driver no one has ripped yet? (I'm using the one in the AUR under the name of Catalyst-test and do have the bug on the 6850. Characters have no textures at all. Everyone else with similar hardware reports this in the game hub.)

            Until the game renders fine, the benchmark makes no sense, as not everything is rendered.

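            For anyone who wants to check whether their driver even advertises the extension in question (the bug could of course be a broken implementation rather than a missing one), here is a quick Python sketch; it assumes the glxinfo utility (mesa-utils / mesa-demos package) is installed.

            Code:
            import subprocess

            # Dump the active driver's GL info and look for the RGTC extension string.
            # Assumes the glxinfo utility is installed and an X session is running.
            output = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
            if "GL_ARB_texture_compression_rgtc" in output:
                print("Driver advertises GL_ARB_texture_compression_rgtc")
            else:
                print("Driver does NOT advertise GL_ARB_texture_compression_rgtc")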


            • #16
              Originally posted by eydee View Post
              So, Michael, did you miss the ARB_texture_compression_rgtc bug on pre-GCN hardware (which causes characters to have no textures), or is there some even newer Ubuntu 15.04 driver no one has ripped yet? (I'm using the one in the AUR under the name of Catalyst-test and do have the bug on the 6850. Characters have no textures at all. Everyone else with similar hardware reports this in the game hub.)

              Until the game renders fine, the benchmark makes no sense, as not everything is rendered.
              In what I saw of the HD 6000 series tests, there weren't any real rendering differences noticeable in the benchmark mode.
              Michael Larabel
              https://www.michaellarabel.com/



              • #17
                It's good performance considering it uses a wrapper, but nice performance compared to what, exactly? It would be good to see how it compares to the Windows version. It's a shame the free Radeon driver can't be used.



                • #18
                  Originally posted by Michael View Post
                  In what I saw of the HD 6000 series tests, there weren't any real rendering differences noticeable in the benchmark mode.
                  There are. People look like wax statues, part of the vegetation is missing, and something on the pavement (mud? a sun reflection?) renders black. If you re-run a quick test, these can be seen in the benchmark. I don't know how much it affects performance. I guess we won't see it fixed before the summer.

                  It has also been confirmed by the VP developer; this is why only GCN cards are officially supported at the moment.



                  • #19
                    Originally posted by eydee View Post
                    There are. People look like wax statues, part of the vegetation is missing, and something on the pavement (mud? a sun reflection?) renders black. If you re-run a quick test, these can be seen in the benchmark. I don't know how much it affects performance. I guess we won't see it fixed before the summer.

                    It has also been confirmed by the VP developer; this is why only GCN cards are officially supported at the moment.
                    Yeah, I see this too.

                    Perhaps it's fixed in the Catalyst 15.3 beta, and that's why Michael didn't see the missing textures?

                    Michael, have you tested the game with Catalyst 14.12 to see if you get missing textures with your Radeon 6000 series GPU?



                    • #20
                      For what it's worth, the game runs great (>70 FPS at 1080p/High) on my GTX 660 and is far from being CPU-limited on my i5.

                      The game's startup script does set __GL_THREADED_OPTIMIZATIONS though, which can help on NVIDIA hardware but not on AMD GPUs...

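                      If anyone on NVIDIA wants to see how much that flag actually matters, here is a minimal A/B launch sketch in Python (the binary path is a placeholder; measure the framerate however you normally do):

                      Code:
                      import os
                      import subprocess

                      def launch(threaded: bool) -> None:
                          # __GL_THREADED_OPTIMIZATIONS is an NVIDIA driver toggle;
                          # it has no effect on AMD hardware.
                          env = dict(os.environ)
                          env["__GL_THREADED_OPTIMIZATIONS"] = "1" if threaded else "0"
                          # Placeholder path -- point this at the actual game binary.
                          subprocess.run(["./BioShockInfinite"], env=env, check=False)

                      launch(threaded=True)   # run once with the optimization on
                      # launch(threaded=False)  # and once with it off, then compare FPS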
