Intel's Linux Sandy Bridge Graphics Still Troubling


  • #21
    Works For Me

    Just upgraded to Sandy Bridge here and it's all working great for the most part... the only trouble I'm having is that GRUB doesn't seem to like loading my kernel from my new 2 TB WD drive. But once it's booted, 2.6.37 works just fine. The integrated graphics in my i5-2400 blow my last desktop's Radeon away, even with desktop effects at the maximum. It isn't perfectly stable (one X server crash so far, in my first day), but given how new this GPU/CPU is, I can't say I'm terribly surprised. Overall, I'm very impressed.

    Note: I did try 2.6.38-rc1, and had a terrible time. Starting KDE was basically impossible without the system locking up (no panic message or anything; the kernel didn't even sync the hard drives).

    My system setup:
    Gentoo GNU/Linux; x86 (32-bit) userland (hey, I value my memory more than CPU time!), with a 64-bit kernel (to actually use all 8 GB of RAM)
    Stable x86 tree, with exceptions for:
    =sys-kernel/gentoo-sources-2.6.37*
    =x11-drivers/xf86-video-intel-2.14*
    =media-libs/mesa-7.10*
    =x11-libs/libdrm-2.4.23* (required by the intel 2.14 driver)
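    Something like the following in /etc/portage/package.keywords should cover those unmasks (a sketch from memory, so double-check the keyword syntax for your arch):

    # /etc/portage/package.keywords: unmask the Sandy Bridge era bits
    =sys-kernel/gentoo-sources-2.6.37* ~x86
    =x11-drivers/xf86-video-intel-2.14* ~x86
    =media-libs/mesa-7.10* ~x86
    =x11-libs/libdrm-2.4.23* ~x86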



    • #22
      Originally posted by jbarnes View Post
      Some of our developers are pretty convinced we can do a higher performance driver without Gallium...
      Hehe, always that fine line between pure performance and development speed. Gotta love it.



      • #23
        Originally posted by jbarnes View Post
        Some of our developers are pretty convinced we can do a higher performance driver without Gallium, but of course it's hard to compare unless you have good classic and Gallium drivers for the same chip...
        Fine. I don't really care whether it's Gallium or not. But this seems like swimming against the current, and it sounds like a lot of work to implement non-Gallium OpenCL, DirectX, etc...



        • #24
          Originally posted by Luke-Jr View Post
          Fine. I don't really care whether it's Gallium or not. But this seems like swimming against the current, and it sounds like a lot of work to implement non-Gallium OpenCL, DirectX, etc...
          Moving to the Gallium3D model seems to be a Good Thing for new drivers, or for existing drivers when there is a significant amount of work remaining to do (as was the case for the r300 and r600 drivers), but it seems to me that the benefits are more related to a cleaner model for the developers to work on rather than an inherently faster API.

          The questions about OpenCL etc. are fair ones, but it seems to me that the new GLSL compiler work probably benefited Intel users more than moving their existing driver to Gallium3D would have, and it helped to improve the common code as well.

          There are still open questions about the ideal IR to use between core mesa and the HW drivers (TGSI vs Mesa IR vs LLVM IR vs GLSL IR) and IMO getting consensus on that is higher priority than moving existing drivers to Gallium3D.


          • #25
            Originally posted by bridgman View Post
            There are still open questions about the ideal IR to use between core mesa and the HW drivers (TGSI vs Mesa IR vs LLVM IR vs GLSL IR) and IMO getting consensus on that is higher priority than moving existing drivers to Gallium3D.
            But wouldn't using a single driver model (i.e. Gallium) help more with optimizing the drivers? From what I understand, resources are the biggest problem, and it isn't exactly efficient to implement features in both classic and Gallium.

            come on intel



            • #26
              Last time I looked, the Mesa stack was still designed around the classic HW driver model, and Gallium3D drivers were "wrapped" by code that converted classic Mesa calls to Gallium3D calls and converted Mesa IR to TGSI. I don't think there is much duplication happening here yet.
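              As a very loose illustration of that wrapping idea (every name below is hypothetical; the real Mesa state tracker is far more involved and uses different interfaces), a classic-style driver callback can be implemented by simply re-expressing the state as a call into a Gallium-style pipe driver:

              /* Hypothetical sketch of a classic-to-Gallium shim; not the actual Mesa code. */
              #include <stdio.h>

              /* "Classic" style: core Mesa hands the driver one piece of GL state. */
              struct classic_state { float clear_color[4]; };

              /* "Gallium" style: a pipe driver exposes a small set of generic hooks. */
              struct gallium_pipe {
                  void (*set_clear_color)(struct gallium_pipe *pipe, const float rgba[4]);
              };

              /* The wrapper: a classic driver callback implemented by forwarding to the pipe driver. */
              static void shim_update_state(struct gallium_pipe *pipe, const struct classic_state *st)
              {
                  pipe->set_clear_color(pipe, st->clear_color);   /* translate, then forward */
              }

              static void example_set_clear_color(struct gallium_pipe *pipe, const float rgba[4])
              {
                  (void)pipe;
                  printf("pipe driver got clear color %.1f %.1f %.1f %.1f\n",
                         rgba[0], rgba[1], rgba[2], rgba[3]);
              }

              int main(void)
              {
                  struct gallium_pipe pipe = { example_set_clear_color };
                  struct classic_state st = { { 0.0f, 0.5f, 1.0f, 1.0f } };
                  shim_update_state(&pipe, &st);   /* classic-style call, Gallium-style execution */
                  return 0;
              }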

              The classic hw driver API can't go away as long as the stack is supporting pre-dx9 hardware.


              • #27
                Originally posted by bridgman View Post
                The classic hw driver API can't go away as long as the stack is supporting pre-dx9 hardware.
                Why not? All the drivers for the older hardware can be ported.



                • #28
                  Originally posted by Luke-Jr View Post
                  Why not? All the drivers for the older hardware can be ported.
                  How would you port them? Most of the older hardware is fixed function (no shaders).

                  The Gallium3D API is shader-only, no fixed function support. Calls to the API make use of GPU functionality which is not present on the older hardware.

                  The classic Mesa HW driver API, on the other hand, was designed around fixed function hardware and supports it natively.

                  There was some discussion about faking support for fixed function hardware in Gallium3D by recognizing shader programs which corresponded to specific fixed-function operations then setting the fixed-function hardware accordingly... but nobody is really excited about that approach.


                  • #29
                    Originally posted by bridgman View Post
                    The Gallium3D API is shader-only, no fixed function support. Calls to the API make use of GPU functionality which is not present on the older hardware.

                    The classic Mesa HW driver API, on the other hand, was designed around fixed function hardware and supports it natively.
                    Honestly, I have to admit I don't have the foggiest idea what shaders are or really anything about OpenGL's technical side. Are shaders the reason almost nothing works with software rendering anymore? If so, what good are they?

                    I thought Gallium3D was more about code reuse, not removing support for older functionality. :/



                    • #30
                      Originally posted by Luke-Jr View Post
                      Honestly, I have to admit I don't have the foggiest idea what shaders are or really anything about OpenGL's technical side.
                      No problem. Early OpenGL implementations used a "fixed function" graphics pipeline, where the application programmer specified things like viewing position, light sources, surface colours/textures and fog directly. Hardware worked the same way and writing drivers was conceptually easy - the GPUs had registers that corresponded to most of the OpenGL settings, so many OpenGL calls could be turned into "obvious" register writes.

                      Shaders are basically programs that execute on the GPU and replace the fixed function hardware. Rather than saying "here is the viewing position, lights go there, there and there", the application uses a vertex shader program which performs transformation, lighting, and a bunch of new effects which couldn't be done with fixed function hardware. Same thing on the pixel processing side - rather than having fixed functions for things like texturing and fog, a program is run on the GPU for every pixel generated from every triangle, and the output of that program is the colour value for that pixel.
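                      To make that concrete, here is a tiny illustrative comparison (assuming old desktop GL 1.x and roughly GLSL 1.20 era syntax, not tied to any particular driver): the fixed-function way of turning on lighting and fog versus a minimal vertex shader that does the equivalent transform-and-light work as a program.

                      /* Illustrative only: fixed-function GL 1.x vs. a minimal GLSL vertex shader. */
                      #include <GL/gl.h>

                      /* Fixed function: the app just sets state; the hardware does the rest. */
                      void setup_fixed_function(void)
                      {
                          GLfloat light_pos[] = { 1.0f, 1.0f, 2.0f, 0.0f };
                          glEnable(GL_LIGHTING);
                          glEnable(GL_LIGHT0);
                          glLightfv(GL_LIGHT0, GL_POSITION, light_pos);
                          glEnable(GL_FOG);               /* fixed-function fog, no program anywhere */
                      }

                      /* Shader model: the same per-vertex work is a small program run on the GPU. */
                      const char *vertex_shader_src =
                          "uniform mat4 mvp;\n"
                          "uniform vec3 light_dir;\n"
                          "attribute vec3 position;\n"
                          "attribute vec3 normal;\n"
                          "varying float brightness;\n"
                          "void main() {\n"
                          "    gl_Position = mvp * vec4(position, 1.0);\n"
                          "    brightness = max(dot(normalize(normal), light_dir), 0.0);\n"
                          "}\n";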

                      Originally posted by Luke-Jr View Post
                      Are shaders the reason almost nothing works with software rendering anymore? If so, what good are they?
                      Pixel and vertex shaders allow application programmers to implement more complex and eye-catching effects, but they do require the GPU to do more work and so software rendering becomes less and less practical. That said, the llvmpipe renderer does a surprisingly good job of executing complex shader programs on a CPU.

                      As long as you have a suitable GPU, shaders are a great way for application programs to implement new and wonderful effects. If you don't have a suitable GPU, you might wish the application developer had stuck with simpler effects that could run faster. That doesn't make shaders bad; they're just a tool to let you do neat visual effects and shift more work to the GPU.

                      Originally posted by Luke-Jr View Post
                      I thought Gallium3D was more about code reuse, not removing support for older functionality.
                      It's an API designed around newer GPUs, which allows a single driver to be used for more interesting things (things like compute, which fixed function hardware could never do), but the cost is that supporting older GPUs without shaders is somewhere between difficult and impractical.
