Open-Source GPU Drivers Causing Headaches In KDE 4.5


  • #11
    The good news is that most of the rearchitecture work affecting GL 2.x and GLSL is nearing completion, so by the end of the year it should be pretty safe to rely on GL 2.x and GLSL, with the caveat that a lot of hardware out there is not fully capable of supporting GL 2.x, let alone GL 3. As long as that hardware is in widespread use, the highest "safe" level of support to assume is going to be a carefully chosen subset of GLSL / GL 2.x.
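    As a rough illustration of what "a carefully chosen subset of GLSL / GL 2.x" can mean in practice, here is a minimal sketch (my own example, not code from KWin or Mesa, and the helper name glsl_path_usable is made up) of the kind of runtime capability probe an application can do before enabling a shader-based path. It assumes a current GL context already exists:

        #include <string.h>
        #include <GL/gl.h>

        /* Hypothetical helper: decide whether a GLSL / GL 2.x rendering path
         * is worth enabling, or whether to stay on a fixed-function GL 1.x
         * fallback. Assumes a current OpenGL context has already been created. */
        static int glsl_path_usable(void)
        {
            const char *version = (const char *) glGetString(GL_VERSION);
            const char *exts    = (const char *) glGetString(GL_EXTENSIONS);

            if (version == NULL || exts == NULL)
                return 0;                        /* no usable context */

            if (version[0] >= '2')               /* "2.0", "2.1", "3.x", ... */
                return 1;

            /* Some pre-GL2 drivers still expose GLSL through the ARB extensions;
             * substring matching is crude but good enough for a sketch. */
            return strstr(exts, "GL_ARB_shader_objects")  != NULL &&
                   strstr(exts, "GL_ARB_vertex_shader")   != NULL &&
                   strstr(exts, "GL_ARB_fragment_shader") != NULL;
        }

    A probe like this only answers "is GLSL advertised?", not "is it fast and bug-free?", which is why the practical baseline still has to be chosen by testing against real drivers rather than trusting the version string alone.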


    • #12
      The few bug reports regarding KWin that we got against our driver did not come from KDE developers; they came from regular users. So KDE devs either did not test Mesa, or they only tested working drivers, or they were too lazy to file bugs.

      Arguments that something is old don't apply here. ATI started shipping a reasonably solid GL 2.1 driver around 2006, after the GL driver rewrite inside ATI (this is known in the OpenGL.org community from a former ATI employee who used to visit the forums there). Before that it was buggy, and GLSL was slow and unoptimized (sometimes with software fallbacks). The first GL3 ATI driver was introduced at the beginning of 2009, so Mesa is almost 2 years behind.

      The number of active open graphics driver developers is pretty low, and what is the actual number of paid developers working on the open ATI/NV drivers? 5 or so? And you expect those guys to write GL2/3/4 drivers with a full-blown hardware-specific shader compiler and optimizer for each generation of hardware capable of at least GL2, i.e. GeForce FX and up and ATI R300 and up? We're getting there, but with such manpower you can't expect Unigine Heaven to work right now. Making graphics drivers is hard. And making graphics drivers ultra-fast is nearly impossible unless you can afford to pay 50 brave devs to work on them fulltime.

      We know that both Mesa and Gallium suck. Still we try to make it suck less every day.

      As always, patches welcome.



      • #13
        Originally posted by marek View Post
        We know that both Mesa and Gallium suck. Still we try to make it suck less every day.
        Watch it, you're getting pretty close to infringing on the Win 95 team's motto: "Windows 95: It sucks less."



        • #14
          Originally posted by marek View Post
          And making graphics drivers ultra-fast is nearly impossible unless you can afford paying 50 brave devs to work on them fulltime.
          50 devs dedicated to optimizing would be able to accomplish a lot, but the number required to support new hardware, new GL versions, new Xorg/kernel versions *and* optimize would be a lot higher.


          • #15
            Originally posted by bridgman View Post
            I don't understand the comment about "KDE developers being cautious enough" -- the fact that "OpenGL 3.0 is a few years old" (actually 2 years) doesn't mean much if most of the target hardware/driver platforms only picked up OpenGL 2.x support very recently and anything past GL 1.5 is still a bit of a work in progress.

            The real question is "what hardware/driver platforms is KDE targeting?".
            Of course that's a valid question. However, I can understand KDE devs thinking that OpenGL 2.x support should be pretty much a given after all these years, especially since OpenGL 2.0 is mostly seen as the baseline everywhere else.

            Also let me say it again: GL on open source drivers sucks.

            By the way, I'm seeing no real efforts at all to get video decode acceleration (of any sort, ASIC or shaders) and GPGPU stuff (OpenCL) to work. It's just sad...



            • #16
              I guess I don't understand why anyone would simply "count the days" and assume that GL 2.x support should be a given? Why not look at what is actually available?

              GL versions have historically been introduced years after the corresponding DX standards, and because of that a lot of hardware "almost supports the standard but not quite", making driver implementation extra interesting. This has changed recently, and in the last 2 years GL standards have caught up with almost 8 years of DX revisions. That is great to see (and will make adoption of subsequent GL standards more timely and more successful) but honestly I don't think it is realistic to ask a small community of volunteers to keep pace with revisions running at 4x the pace of the PC industry.

              I'm not sure exactly what you think should be done about the situation -- your second-to-last sentence is complaining about GL support (which is where all the developers are focused today), then your last sentence is complaining that "nobody seems to be working on decode acceleration". Where do you think developers should be focusing right now - improving GL or implementing decode acceleration?

              Are you suggesting that developers should stop work on GL improvement so that they can work on decode acceleration instead?


              • #17
                Of course not; I just wanted to point out how the gap between closed-source and open-source drivers is widening even further. I know there's not much manpower left for these things (and that has been the main problem for the last few years overall).



                • #18
                  Originally posted by brent View Post
                  Of course not, I just wanted to point out how the gap between closed source and open source drivers is widening even more.
                  I'm not sure that is actually the case, unless you start with the assumption that GL 4 is twice as good as GL 2.

                  The recent OpenGL version numbers are a bit misleading, in the sense that going from one pre-GL2 version to another was a fairly big task even though the version number only went up by 0.1. Going from GL 1.5 to GL 2.x was an even bigger task (and involved significant re-architecture work), but the jump from GL 2 to GL 3 and from GL 3 to GL 4 is smaller by comparison.


                  • #19
                    The problem with Mesa is that 3D graphics is not a widespread area of expertise... I am a programmer and I would like to contribute to it, but I know nothing about it. I don't mind studying a bit, but I do not know where to start.

                    People at the Mesa project should first provide some form of documentation in order to get more help from volunteers. I tried looking at mesa3d.org but didn't find much...



                    • #20
                      Originally posted by Hephasteus View Post
                      Die shrinking has slowed to a crawl, so 4.0 is the end of the line for many, many years. But it's still best to get there as quickly as possible, as it will be selling cards for quite some time.
                      Moore's law: "The performance of computers increases by 100% every three years."

                      Look it up; it has applied since the very beginning up until now. I remember people thinking the same thing in the P4 days.
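
                      Taking the "100% every three years" figure at face value (a back-of-the-envelope check I'm adding, not the usual statement of the law), the growth compounds as:

                          $ \mathrm{speedup}(t\ \text{years}) = 2^{t/3}, \qquad 2^{9/3} = 8\times \ \text{(9 years)}, \qquad 2^{30/3} = 1024\times \ \text{(30 years)} $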

