ATI's Gallium3D Driver Is Still Playing Catch-Up


  • d4ddi0
    replied
    blender

    Originally posted by MostAwesomeDude View Post
    I do Blender. I'm kind of fail at it, but I can do some simple stuff.

    The reason that we avoided stipples is a combination of not many apps using it and the fact that we don't have it working in HW yet. I'll put more effort towards it and see if we can get something working soon.

    As far as large vert counts go, we do just fine on that front; if you've got a real-world problem there, let us know.
    Since Blender supports Python extensions, some ambitious coder could begin to write some interesting synthetic benchmarks using Blender...

    You might end up learning as much about the strengths and weaknesses of Blender's own data handling and rendering as about the driver, but that would be kind of a win-win too.

    I wonder what that might look like.
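
    Just to make the idea concrete, here is a rough, untested sketch of what such a script might look like; it assumes the Blender 2.5 bpy Python API, only times scene construction (data handling) rather than actual GPU drawing, and the object count is an arbitrary choice of mine:

        # Rough sketch of a synthetic Blender benchmark; run inside Blender,
        # e.g. blender --python benchmark.py (filename is illustrative).
        # Assumes the 2.5 bpy API.
        import time
        import bpy

        def time_scene_build(count=500):
            # Time how long it takes to add `count` cubes to the scene,
            # as a crude proxy for Blender's data-handling throughput.
            start = time.time()
            for i in range(count):
                bpy.ops.mesh.primitive_cube_add(location=(i % 25, i // 25, 0))
            elapsed = time.time() - start
            print("Added %d cubes in %.2f s (%.1f objects/s)"
                  % (count, elapsed, count / elapsed))

        time_scene_build()

    Timing viewport redraws after a build like that would be the part that actually exercises the driver.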

  • frej
    replied
    Originally posted by phoronix View Post
    Phoronix: ATI's Gallium3D Driver Is Still Playing Catch-Up

    Yesterday we delivered benchmarks showing how the open-source ATI Radeon graphics driver stack in Ubuntu 10.04 is comparing to older releases of the proprietary ATI Catalyst Linux driver. Sadly, the latest open-source ATI driver still is no match even for a two or four-year-old proprietary driver from ATI/AMD, but that is with the classic Mesa DRI driver. To yesterday's results we have now added in our results from ATI's Gallium3D (R300g) driver using a Mesa 7.9-devel Git snapshot from yesterday to see how this runs against the older Catalyst drivers.

    http://www.phoronix.com/vr.php?view=14757
    About the Warsow benchmark

    Are you sure the game is run exactly the same way? The newer fglrx has more OpenGL support (2.0 vs. 1.x?).
    Warsow might enable more graphics features because the driver exports OpenGL 2.0, or it might even use a completely different render path that requires OpenGL 2.0.

    This would of course skew the benchmark.
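
    One quick way to confirm what each driver actually exposes (and hence which render path Warsow may select) is to read the GL strings from a live context; running "glxinfo | grep 'OpenGL version'" from the shell gives the same answer. The snippet below is only an illustrative sketch using PyOpenGL and GLUT, which are my choice and not anything from the article:

        # Print the OpenGL vendor/renderer/version strings the current driver exposes.
        # Assumes PyOpenGL with GLUT bindings; any context-creation method would do.
        from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION
        from OpenGL.GLUT import (glutInit, glutInitDisplayMode, glutCreateWindow,
                                 GLUT_RGBA, GLUT_SINGLE)

        glutInit()
        glutInitDisplayMode(GLUT_RGBA | GLUT_SINGLE)
        glutCreateWindow(b"gl-caps")  # a context must be current before glGetString works

        print("Vendor:  ", glGetString(GL_VENDOR).decode())
        print("Renderer:", glGetString(GL_RENDERER).decode())
        print("Version: ", glGetString(GL_VERSION).decode())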

  • MrCooper
    replied
    Originally posted by MostAwesomeDude View Post
    The reason that we avoided stipples is a combination of not many apps using it and the fact that we don't have it working in HW yet. I'll put more effort towards it and see if we can get something working soon.
    Meanwhile, the draw module would offer at least an interim solution. The same goes for some other less popular OpenGL features; it should be possible to achieve correctness without any software rasterization (see e.g. the svga driver).

  • chrisr
    replied
    I haven't noticed any improvement with ColorTiling enabled

    Originally posted by airlied View Post
    I think the final speed-ups, in order of when they'll get done, won't be driver code optimisations as much as GPU feature usage:

    1. Color tiling - need to enable by default and fix regressions
    I am running F12 with KMS, the xorg-x11-ati driver and Mesa from git, and have just enabled ColorTiling. I can't say that it has made any difference (at all) to Celestia. Celestia still feels "speedy" under r300c, and sluggish under r300g. And I have made sure that I'm using the "OpenGL vertex program" rendering path in both cases.

    This is with my RV350 (Radeon 9550).
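
    For reference, color tiling is not on by default in this stack yet; it is toggled via the radeon DDX option in xorg.conf, roughly as below (the option name is from the radeon man page, the rest of the Device section is just illustrative):

        Section "Device"
            Identifier "Radeon"
            Driver     "radeon"
            # Not yet the default, per airlied's list above
            Option     "ColorTiling" "on"
        EndSection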

  • EvilTwin
    replied
    thanks
    Again, one less thing I don't know.

    Leave a comment:


  • glisse
    replied
    Originally posted by EvilTwin View Post
    Sorry if this is kind of a stupid question.
    Are the mesa/gallium3d drivers trying to do the same thing as radeon, radeonhd and so on?
    If yes: why are there so many projects trying to do pretty much the same thing?
    And what's the difference between the mesa/gallium3d stuff and radeon(hd)?
    DDX driver: radeon | radeonhd
    OpenGL driver: radeon classic or gallium

    The DDX driver only accelerates X11 rendering; it doesn't provide GL acceleration of any kind. The OpenGL driver provides OpenGL acceleration on top of a DDX driver, though with Gallium a Gallium driver can also replace the DDX.

    So no, mesa/gallium isn't duplicated effort.

  • EvilTwin
    replied
    Sorry if this is kind of a stupid question.
    Are the mesa/gallium3d drivers trying to do the same thing as radeon, radeonhd and so on?
    If yes: why are there so many projects trying to do pretty much the same thing?
    And what's the difference between the mesa/gallium3d stuff and radeon(hd)?

  • Killigrew
    replied
    When using Gallium3D, Compiz crashes when I resize windows heavily.

    Greetings

  • smitty3268
    replied
    Originally posted by bridgman View Post
    The back end of LLVM didn't seem to be a good fit for architectures where a single instruction word included multiple instructions, which is the case for 3xx-5xx (vector + scalar) and for 6xx+ (up to 5 scalar instructions).

    As MostAwesomeDude said, the existing shader compiler in the 300 and 300g driver seems pretty good - might not be as good as the one in fglrx but my guess is that it's pretty close.

    Right now there are no signs that the shader compiler is the bottleneck. The performance graphs indicate that the driver is CPU limited quite a bit of the time, and as airlied said there are also some GPU optimizations still to be done.

    LLVM may turn out to be more useful for single-instruction architectures like the Intel and NVidia GPUs, not sure.
    From what I've heard, a real compiler with serious optimization work will be required for decent performance on more modern hardware that can run much more complex shaders. That may not be the case for r500, which is relatively limited. If no one steps up to do the necessary work for VLIW backend support in LLVM, then there has also been talk of just going Gallium TGSI -> LLVM -> TGSI -> and on to the radeon compiler. That would allow serious optimizations to be run on the Gallium "bytecode" before handing it back to the normal driver to send to the VLIW hardware.

  • MostAwesomeDude
    replied
    Originally posted by bugmenot View Post
    Yes, it's needed by programs like Blender, which uses stippled lines to show different types of things in the 3D view. I know there are software fallbacks in r300 classic, and when they are disabled the interface is unable to provide full information to the user, so it really matters. And the same happens in other CAD programs.

    Please consider that games and a handful of minor apps aren't the only things drivers should work with; try "serious work" apps too, like 3D apps, CAD and similar things. My impression (or should I say experience?) is that games just go for eye candy, while work apps are the ones that really show the quality of drivers and cards, as they cover all fields (Blender has GLSL, e.g.) and can push a computer to the limit (they will not whine if your model has 1M polys... your computer and card will first). They also do other things, like selection operations, that hardly or never happen in games. Etc, etc...

    If you don't know how to use those programs, ask their communities; we will gladly help get the drivers to a usable level. Thanks for the work up to now, in any case.
    I do Blender. I'm kind of fail at it, but I can do some simple stuff.

    The reason that we avoided stipples is a combination of not many apps using it and the fact that we don't have it working in HW yet. I'll put more effort towards it and see if we can get something working soon.

    As far as large vert counts go, we do just fine on that front; if you've got a real-world problem there, let us know.
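
    For anyone wondering what the feature under discussion looks like at the API level, here is a rough sketch (PyOpenGL plus GLUT, my choice for brevity) of the legacy fixed-function line-stipple path that Blender's 3D view relies on, and which currently has no hardware path in r300g:

        # Minimal legacy OpenGL line-stipple example (dashed line across the window).
        # Assumes PyOpenGL with GLUT bindings is installed.
        from OpenGL.GL import (glClear, glEnable, glLineStipple, glColor3f,
                               glBegin, glEnd, glVertex2f, glFlush,
                               GL_COLOR_BUFFER_BIT, GL_LINE_STIPPLE, GL_LINES)
        from OpenGL.GLUT import (glutInit, glutInitDisplayMode, glutCreateWindow,
                                 glutDisplayFunc, glutMainLoop, GLUT_RGBA, GLUT_SINGLE)

        def draw():
            glClear(GL_COLOR_BUFFER_BIT)
            glEnable(GL_LINE_STIPPLE)
            glLineStipple(2, 0x00FF)  # repeat factor 2, pattern: 8 bits on, 8 bits off
            glColor3f(1.0, 1.0, 1.0)
            glBegin(GL_LINES)
            glVertex2f(-0.9, 0.0)
            glVertex2f(0.9, 0.0)
            glEnd()
            glFlush()

        glutInit()
        glutInitDisplayMode(GLUT_RGBA | GLUT_SINGLE)
        glutCreateWindow(b"stipple test")
        glutDisplayFunc(draw)
        glutMainLoop()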
