A Batch Of Graphics Cards On Gallium3D


  • phoronix
    started a topic A Batch Of Graphics Cards On Gallium3D

    Phoronix: A Batch Of Graphics Cards On Gallium3D

    Yesterday the results for the Intel Core i5 2500K graphics on Linux were finally published, after Intel supplied a new motherboard and CPU that did not exhibit the earlier Sandy Bridge problems. That article included results for several ATI Radeon graphics cards using both the proprietary Catalyst driver and AMD's open-source Gallium3D driver, plus testing of an NVIDIA GPU under the reverse-engineered Nouveau driver, which is also written against the Gallium3D architecture. This article presents an even larger round-up of graphics cards tested under open-source Gallium3D drivers, along with results from the Gallium3D-based LLVMpipe driver.

    http://www.phoronix.com/vr.php?view=15675

  • smitty3268
    replied
    Originally posted by Michael View Post
    Because it's not an "out of the box" configuration... While I'm sure there are a fair number of active Phoronix members who may install it, as far as overall Linux usage goes, how many people do you think will actually go forward and do it, or even know about it? Not many at all.
    While I understand and support this decision for most of the tests Phoronix does, I believe S3TC is different. Without it, the apps simply will not work. When that happens, users will ask around on forums and figure out how to get it working - or else they won't. I think it's useful to know the performance of these applications for those who do, and those who don't won't care anyway. It's not some obscure setting to speed things up; it determines whether you can actually get a game to run at all.

    Otherwise, you're still going to be testing the Quake 3 engine 10 years from now, and that's already basically a useless test on today's hardware. It's like glxgears: a good test for verifying basic functionality, but not much good for anything more than that.

  • Kano
    replied
    @bernstein

    I have not been able to test a Sandy Bridge CPU's onboard graphics yet, but for basic games Intel graphics have usually been enough. It is definitely no solution for hardcore gamers, as you do not even get OpenGL 3.x functionality on Linux (even though the chip has DX10.1/OpenGL 3.x hardware features). The current binary-only drivers (ATI and NVIDIA) support OpenGL 4.x, and the Unigine engine can already use it for tessellation effects (though that really needs a very powerful card). But for a simple game from time to time, Intel graphics should be enough.

    Much more interesting would be whether the integrated media ENCODER can be used; that would speed up H.264 encoding considerably...

  • oibaf
    replied
    Originally posted by whitecat View Post
    Really? How do you disable S3TC in ta-spring? On my machine with r600g the textures are corrupted!
    Spring can be run with S3TC without libtxc_dxtn:
    http://spring.bochs.info/phpbb/viewt...24720&p=461420

    However, this still doesn't work with the Gallium drivers:
    https://bugs.freedesktop.org/show_bug.cgi?id=29012
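For Mesa-based drivers, the usual knob behind the linked workaround is the `force_s3tc_enable` driconf option: it makes the driver advertise the S3TC extensions even when libtxc_dxtn is absent, so games that ship pre-compressed textures can run (on-the-fly compression still needs the library). A minimal sketch, assuming a Mesa driver that honors driconf environment options:

```shell
#!/bin/sh
# Make Mesa advertise the S3TC extensions without libtxc_dxtn.
# Note: apps that upload pre-compressed textures then work; apps
# asking the driver to compress on the fly still need libtxc_dxtn.
export force_s3tc_enable=true
echo "force_s3tc_enable=$force_s3tc_enable"
# then launch the game from this same shell, e.g.:
# spring
```

Because it is an environment variable, it only affects processes started from the shell where it was exported.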

  • bernstein
    replied
    intel rocks @ oss

    Looking at this test and some other Phoronix graphics tests, it seems to me that using a discrete graphics card (other than the X1xxx series) with an open-source driver provides only a minimal advantage over running the 2500K IGP (excluding Lightsmark). Just stunning. Or did I get the wrong impression?

  • whitecat
    replied
    Originally posted by marek View Post
    Better compile a 32-bit glxinfo and glxgears and see if you get direct rendering.
    Good idea, I will check.
    Thank you for your answers.

  • marek
    replied
    Originally posted by whitecat View Post
    OK, I understand. So in my case, on my system I currently have:
    - 64-bit kernel
    - both 64/32-bit libdrm packages
    - both 64/32-bit xorg-x11-drv-ati packages
    - both 64/32-bit mesa-dri-drivers, mesa-libGL and mesa-libGLU packages
    - both 64/32-bit libtxc_dxtn packages
    - a 32-bit game

    I shouldn't expect 3D problems?
    Yes, it should be OK with that config. Better to compile a 32-bit glxinfo and glxgears and see if you get direct rendering. I think you don't need the 32-bit xorg-x11-drv-ati. Also, libdrm is no longer required for Gallium.
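The direct-rendering check suggested above can be sketched as a quick shell test (run the 32-bit glxinfo build to exercise the 32-bit driver stack specifically; the package name in the fallback message is a common but distro-specific assumption):

```shell
#!/bin/sh
# Check whether GL clients get direct rendering.
# "direct rendering: Yes" means the DRI driver loaded; "No" means
# the indirect OpenGL 1.4 fallback path is in use.
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo 2>/dev/null | grep -i "direct rendering"
else
    echo "glxinfo not found; install it (e.g. the mesa-utils package)"
fi
```

On a 64-bit system this only answers for whichever glxinfo binary is first in the PATH, so point it at the 32-bit build when diagnosing 32-bit games.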

  • whitecat
    replied
    Originally posted by marek View Post
    On a 64-bit kernel, you need 32-bit Mesa (libGL + driver) and 32-bit libtxc_dxtn, which is not an out-of-the-box experience either. Otherwise, 32-bit 3D apps will either crash, misrender, or be slow.
    OK, I understand. So in my case, on my system I currently have:
    - 64-bit kernel
    - both 64/32-bit libdrm packages
    - both 64/32-bit xorg-x11-drv-ati packages
    - both 64/32-bit mesa-dri-drivers, mesa-libGL and mesa-libGLU packages
    - both 64/32-bit libtxc_dxtn packages
    - a 32-bit game

    I shouldn't expect 3D problems?

  • marek
    replied
    Originally posted by whitecat View Post
    That's a big piece of information!
    So if I summarize: running ETQW/Doom3 on a 64-bit kernel + libGL.i686 is less optimized than running it on a 32-bit kernel + libGL.i686?
    On a 64-bit kernel, you need 32-bit Mesa (libGL + driver) and 32-bit libtxc_dxtn, which is not an out-of-the-box experience either. Otherwise, 32-bit 3D apps will either crash, misrender, or be slow.

  • whitecat
    replied
    Originally posted by marek View Post
    The problem is that you need a 32-bit driver in order to accelerate those closed 32-bit apps, i.e. to get direct rendering. If you don't have it, indirect rendering is used, which doesn't have all the features Mesa has: it's stuck at OpenGL 1.4 with far fewer extensions. Things might work, but the performance will suck, to say the least.
    That's a big piece of information!
    So if I summarize: running ETQW/Doom3 on a 64-bit kernel + libGL.i686 is less optimized than running it on a 32-bit kernel + libGL.i686?
