Radeon Gallium3D With Mesa 8.0: Goes Up & Down


  • #11
    Originally posted by Pontostroy
    Why is Lightsmark so slow in this test? With SwapbuffersWait=off and pcie_gen2=1, a Radeon HD 6770 shows 130 fps at 1920x1080.
    It would be very interesting if someone could bisect and see if there's a single commit that caused this pretty incredible boost. I'd do it myself but I don't have any NI hardware.
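
    For anyone picking this up, a bisect of the Mesa tree could look roughly like the sketch below; the repository URL and tag names are assumptions based on Mesa's usual conventions, and each step needs a rebuild plus a Lightsmark run before marking the revision good or bad.
    Code:
    # Hypothetical bisect between the 7.11.2 and 8.0 tags - adjust to taste.
    git clone git://anongit.freedesktop.org/mesa/mesa
    cd mesa
    git bisect start
    git bisect bad mesa-8.0        # revision where the numbers changed
    git bisect good mesa-7.11.2    # known baseline
    # After each checkout: rebuild Mesa, rerun Lightsmark, then mark the result.
    git bisect good                # or "git bisect bad"; repeat until the commit is found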



    • #12
      Originally posted by drag
      I think that you are seeing the effect of a lot of man-hours being poured into application-specific optimizations: lots of benchmarking over and over again, and tweaking the driver to create special cases and optimized code paths for popular game engines and common application-developer patterns.
      I highly doubt the Catalyst driver was optimized for Nexuiz or Warsow. The optimizations you describe are done for AAA titles like Skyrim, but certainly not for some random open source game. I would even go so far as to say that the current Catalyst driver is no longer optimized for Doom 3 or any of the old engines some open source games are based on.

      Because benchmarks do not tell the whole story. If I were building a computer for the sole purpose of rendering a game four times faster than my monitor can display it, then there would never be any point in caring about anything past benchmarks. But the Catalyst drivers suck. Not only are they a pain to deal with and closed source, but they are unstable, don't play the games that I use very well, the 2D performance tends to be poor... they can't even render my desktop correctly.

      If you want decent OpenGL performance with an OSS system, then ATI is your best bet.
      I agree, the Catalyst drivers suck. However, Gallium3D/Mesa is not much better because it lacks many of the features offered by modern cards (e.g. support for the hw video decoders). If you want decent OpenGL performance and full support for your graphics card on Linux, you still have to go for Nvidia cards. That's the sad truth.



      • #13
        Originally posted by whizse
        It would be very interesting if someone could bisect and see if there's a single commit that caused this pretty incredible boost. I'd do it myself but I don't have any NI hardware.
        A Mesa commit or a kernel commit?

        Two months ago I did a couple of tests:
        Code:
                     pcie_gen2=1   pcie_gen2=0
        Lightsmark   116 fps       90 fps
        Now, with git Mesa and pcie_gen2=1, I get 130 fps at 1920 and 210 fps at 1024. In the shadow tests fps increases up to 3x, but Lightsmark is the only application that gets a big performance boost.
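
        For reference, the two knobs mentioned in this thread live in different places: pcie_gen2 is a parameter of the radeon kernel module, while SwapbuffersWait is an option of the radeon X driver. A minimal sketch of how they are typically set (file paths vary by distribution):
        Code:
        # /etc/modprobe.d/radeon.conf - request PCIe gen2 link speed from the radeon module
        options radeon pcie_gen2=1

        # xorg.conf Device section - don't make buffer swaps wait for vblank
        Section "Device"
            Identifier "Radeon"
            Driver     "radeon"
            Option     "SwapbuffersWait" "off"
        EndSection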



        • #14
          Originally posted by Temar
          I highly doubt the Catalyst driver was optimized for Nexuiz or Warsow.
          I feel like I should reply to this, but since you didn't read to the end of what you quoted, I guess you can just do that.

          I agree, the Catalyst drivers suck. However, Gallium3D/Mesa is not much better because it lacks many of the features offered by modern cards (e.g. support for the hw video decoders).
          They're a lot better because they work and are stable.

          If you want decent OpenGL performance and full support for your graphics card on Linux, you still have to go for Nvidia cards. That's the sad truth.
          No, it's not. It may be the story you tell in order to sell yourself on shoving 5.5MB of Windows XP driver code into your Linux kernel, but I disagree entirely. I've been using nothing but open source OpenGL drivers for several years now, with good results. If you are a 'hard core gamer' or you have special applications that require the driver's features, then you will want to use the Nvidia proprietary drivers. Otherwise they are not necessary at all.

          And if you want video decoding, especially for low-power devices, you can just go out and buy a CrystalHD device...



          • #15
            Originally posted by Pontostroy
            A Mesa commit or a kernel commit?

            Two months ago I did a couple of tests:
            Code:
                         pcie_gen2=1   pcie_gen2=0
            Lightsmark   116 fps       90 fps
            Now, with git Mesa and pcie_gen2=1, I get 130 fps at 1920 and 210 fps at 1024. In the shadow tests fps increases up to 3x, but Lightsmark is the only application that gets a big performance boost.
            I was thinking of Mesa; compare 7.11.2 with 8.0.

            You didn't get the same boost in OpenArena and Nexuiz as Michael did?



            • #16
              Originally posted by drag
              I feel like I should reply to this, but since you didn't read to the end of what you quoted, I guess you can just do that.
              And I feel like I should reply to this, but since you didn't read the part which you didn't quote, I guess you can just do that.


              No, it's not. It may be the story you tell in order to sell yourself on shoving 5.5MB of Windows XP driver code into your Linux kernel, but I disagree entirely. I've been using nothing but open source OpenGL drivers for several years now, with good results. If you are a 'hard core gamer' or you have special applications that require the driver's features, then you will want to use the Nvidia proprietary drivers. Otherwise they are not necessary at all.
              Sorry, but claiming that you had good results with the OSS OpenGL driver over the last few *YEARS* just tells me that you pretty much only use 2D. I sold my last ATI card 1.5 years ago, and at that time the driver was a mess. You could not even run the 3D effects of your desktop properly. Forget about real 3D applications like Blender - Blender just crashed every few minutes when working with bigger models. While Blender might fall into your "special applications" category, a 3D desktop certainly should not.

              And if you want video decoding, especially for low-power devices, you can just go out and buy a CrystalHD device...
              You can't plug a CrystalHD card into every device - just think about notebooks. Besides, you already paid for the features of your gfx-card. Why would anyone want to buy another device? Just for the sake of ideology?

              So I stand by my opinion. If you want decent performance and actually use the features of your gfx-card, you have to go with Nvidia + binary drivers. The current OSS drivers might be good enough for 3D desktops, but even if you just want to play games based on 10-year-old engines, you already need a pretty decent card to get good performance with the OSS driver.



              • #17
                Originally posted by Temar
                So I stand by my opinion. If you want decent performance and actually use the features of your gfx-card, you have to go with Nvidia + binary drivers.
                You are wrong: if you "want decent performance and actually use the features of your gfx-card", you have to go with Windows. The Nvidia blob for Linux may be slightly better than the AMD blob for Linux, but it's still crap compared to the Windows drivers (either AMD or Nvidia).



                • #18
                  That's incorrect. When you compare Unigine Heaven results, OpenGL should be more or less identical on both platforms. Even when you compare DX vs. OpenGL in Heaven, the difference is not that huge. I did the same tests on an ATI HD 5 series card, and there the difference between DX and OpenGL really was huge, but it improved a bit after a German PC magazine (c't) ranted about it. If you claim something like that, do proper benchmarks. You should not compare Wine games with Windows-native ones when they do not use OpenGL, however; that's unfair due to the extra emulation layer. But you can certainly compare OpenGL 3.x against DX10 and OpenGL 4.x against DX11. Unigine is a bit stupid as well: you cannot just use the default settings, as the AA/AF settings are slightly different between Linux and Windows, so be sure you test with the same settings.



                  • #19
                    Originally posted by whizse
                    It would be very interesting if someone could bisect and see if there's a single commit that caused this pretty incredible boost. I'd do it myself but I don't have any NI hardware.
                    No one is interested in doing this?

