A Batch Of Graphics Cards On Gallium3D

  • #31
    Originally posted by whitecat View Post
    OK, I understand. So in my case, on my system I currently have:
    - 64-bit kernel
    - both 64/32-bit libdrm packages
    - both 64/32-bit xorg-x11-drv-ati packages
    - both 64/32-bit mesa-dri-drivers, mesa-libGL and mesa-libGLU packages
    - both 64/32-bit libtxc_dxtn packages
    - a 32-bit game

    I shouldn't expect 3D problems?
    Yes, it should be OK with that config. Better to compile a 32-bit glxinfo and glxgears and see if you get direct rendering. I think you don't need the 32-bit xorg-x11-drv-ati. Also, libdrm is no longer required for Gallium.
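    A quick way to act on this suggestion is to build glxinfo as a 32-bit binary and grep its output; a minimal sketch, assuming a multilib-capable gcc and the glxinfo.c source from the mesa-demos package (the file paths are illustrative):

    ```shell
    # Build a 32-bit glxinfo (assumes gcc multilib support and the
    # mesa-demos source file glxinfo.c in the current directory).
    gcc -m32 glxinfo.c -o glxinfo32 -lGL -lX11

    # If the 32-bit Mesa stack is set up correctly, this prints
    # "direct rendering: Yes"; "No" means GLX fell back to indirect
    # (usually a missing or mismatched 32-bit DRI driver).
    ./glxinfo32 | grep "direct rendering"
    ```

    The same `-m32` build works for glxgears, which gives a quick visual confirmation that the 32-bit path actually renders.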


    • #32
      Originally posted by marek View Post
      Better to compile a 32-bit glxinfo and glxgears and see if you get direct rendering.
      Good idea, I will check.
      Thank you for your answers.


      • #33
        intel rocks @ oss

        Looking at this test and some other Phoronix graphics tests, it seems to me that using a discrete graphics card (other than an X1xxx) with an open-source driver provides only a minimal advantage over running a 2500K IGP (excluding Lightsmark). Just stunning. Or did I get the wrong impression?


        • #34
          Originally posted by whitecat View Post
          Really? How do I disable S3TC on ta-spring? On my machine with r600g the textures are corrupted!
          Spring can be run with S3TC without libtxc_dxtn:

          However this still doesn't work with gallium drivers:

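          For reference, Mesa of that era had a driconf option, `force_s3tc_enable`, that exposed the S3TC extensions even when libtxc_dxtn was not installed; whether it actually helps here is exactly what the post disputes for the Gallium drivers. A sketch, assuming a Mesa build with that option and a `spring` binary on the PATH:

          ```shell
          # Ask Mesa to advertise the S3TC extensions even without
          # libtxc_dxtn installed. Per this thread, the classic drivers
          # handled this, while the Gallium drivers (e.g. r600g) still
          # rendered corrupted textures at the time.
          force_s3tc_enable=true spring
          ```

          The option can also be set per-user via driconf, which writes it into `~/.drirc` so it applies to every GL application.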

          • #35

            I could not test a Sandy Bridge CPU yet for onboard VGA, but for basic games Intel gfx was usually enough. It is definitely no solution for hardcore gamers, as you do not even get OpenGL 3.x functionality under Linux (the chip has DX10.1/OpenGL 3.x hardware features). The current binary-only drivers (ATI+NV) support OpenGL 4.x; the Unigine engine can already use it for tessellation effects (but it really needs a very powerful card). Still, for a simple game from time to time, Intel gfx should be enough.

            Much more interesting would be how the integrated media ENCODER can be used; that would speed up H.264 encoding very much...


            • #36

              Originally posted by Michael View Post
              Because it's not "out of the box" configuration... While I'm sure there's a fair number of active Phoronix members that may install it, as far as overall Linux usage goes, how many people do you think will actually go forward and do it or even know about it? Not many at all.
              While I understand and support this decision for most of the tests Phoronix does, I believe S3TC is different. Without it, apps simply will not work. Once that happens, users will ask around on forums and figure out how to get it working, or else they won't. I think it's useful to know the performance of these applications for those who do, and those who don't won't care anyway. It's not some obscure setting to speed things up; it's simply whether or not you can actually get a game to run.

              Otherwise, you're still going to be testing the Quake 3 engine 10 years from now, and that's already basically a useless test on today's hardware. It's like glxgears: a good test to run to ensure basic functionality, but not much good for anything beyond that.