Open-Source GPU Drivers Causing Headaches In KDE 4.5


  • Originally posted by Veerappan View Post
    I'll be committing my project code directly to the WebM project git repository (possibly a branch of the public repository during development). I have a feeling that Michael will probably pick up a story on this when it's complete.

    For now, you can start looking at the VP8 decoder interface source:
    http://review.webmproject.org/gitweb...c2ad3d;hb=HEAD

    I'd start at vp8_dx_iface.c and go from there. It's not the most well-commented code (massive understatement?), but I might spend some time cleaning that up as well while I'm learning the code-base.
    Well, what I need is an MPEG-2 bitstream parser. I was writing one when I realised there was a much better way of doing it by looking at mplayer's code.

    But starting from scratch with the bitstream parser is just too time-consuming for me. I have a personal git repo with gallium/mesa code that is able to run vdpauinfo.
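
    (For anyone wondering what such a parser boils down to: the core is just an MSB-first bit reader over the elementary stream. Below is a minimal, purely illustrative sketch; the names bitreader and get_bits are made up here and are not taken from mplayer or the repo mentioned above.)

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical minimal MSB-first bit reader, the building block of an
     * MPEG-2 bitstream parser. Illustrative only. */
    typedef struct {
        const uint8_t *data;  /* elementary stream bytes       */
        size_t size;          /* total size in bytes           */
        size_t bitpos;        /* current read position in bits */
    } bitreader;

    /* Read the next n bits (n <= 24) as an unsigned value, MSB first. */
    static uint32_t get_bits(bitreader *br, unsigned n)
    {
        uint32_t v = 0;
        while (n-- && (br->bitpos >> 3) < br->size) {
            uint8_t byte = br->data[br->bitpos >> 3];
            v = (v << 1) | ((byte >> (7 - (br->bitpos & 7))) & 1);
            br->bitpos++;
        }
        return v;
    }

    /* Usage: after locating the 0x000001B3 sequence_header start code,
     * the first two fields are the 12-bit horizontal and vertical sizes:
     *
     *     unsigned width  = get_bits(&br, 12);
     *     unsigned height = get_bits(&br, 12);
     */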



    • Originally posted by bridgman View Post
      The awkward reality is that the "KDE situation" is the worst possible case in many respects, in the sense that KDE developers are starting to make use of features which are also under development in the open drivers.
      These are features which have been part of the OpenGL spec for years. They're no longer under active development in the blobs; they're already past QA and several rounds of bug fixing. The problematic Blur effect only uses OpenGL 2.x and GLSL.

      Originally posted by bridgman View Post
      ...but since they are using features which were exposed very recently, which are working well with many applications but which are still being tested and fixed on others, there is bound to be a big heap of pain unless the two groups coordinate their efforts.
      It isn't KDE's responsibility to fix problems with Open Drivers. I can't really see splitting KDE.org's resources between desktop development and driver bugfixes as worthwhile or productive. All that does is slow the development of KDE.

      It isn't super-recent OpenGL 3 features that are causing problems; it's OpenGL 2 / GLSL features which have proven themselves in the industry as a solid de facto standard. Make no mistake, these are not KDE bugs and they're not problems with hardware needing to be too new. These are problems with Open Drivers failing to keep up with even five-year-old standards, mostly because the features aren't exercised anywhere and are thus a low priority to fix.

      So when KDE 4.6+ rolls around with the new OpenGL 3 spec, it is going to be extremely problematic for people with Open Drivers. But this is how progress is made. If enough people use KDE 4.6/4.7 with these features and have Open Drivers, bugs will get filed, triaged, and subsequently fixed. The best thing KDE can do for these drivers is to keep setting the bar higher and pushing newer versions of the spec. Driver developers will then respond to those increased expectations, since for the first time there is high demand for said features.



      • Originally posted by marek View Post
        ATI have never released a GL2 GPU. Some GLSL features are not supported by hardware, the GL2 non-power-of-two textures are not supported at all. R300 up to R400 don't fully support separate stencil. NV30 has similar limitations (like separate blend equations are not supported).
        Sorry, the R300 GPU is DX7. It's not an OpenGL 2.0 GPU and can't be. I think R500 is where you have to go for that. NV30 is NVIDIA's 5 series. That's a DX8 GPU, not a DX9 GPU and not an OpenGL 2.0 GPU.
        ATI's X1300 to X1950 and NVIDIA's 6xxx series should be the bottom line in my mind. Everything else is a decade old and completely unsupported by hardware vendors on any OS, except for maybe legacy Windows 98 drivers hanging around on their websites.
        As much as Linux wants to support this stuff and brag about supporting unsupported hardware, it's going to make Mesa and X11 a mess. Sure, it's 180 to 150 nm silicon that's likely to keep working and never die, but it really has to be let go; it's a boat anchor on the modernization of Linux.



        • Originally posted by kazetsukai View Post
          So when KDE 4.6+ rolls around with the new OpenGL 3 spec, it is going to be extremely problematic for people with Open Drivers. But this is how progress is made.
          Progress? Bullshit. There is no reason to demand GL3, there is no feature in GL3 you need for desktop effects. The top-notch games don't need it, so you don't need it either. GL2 and D3D9 are far more powerful than you seem to imagine. Game consoles are stuck in the D3D9 level of features and there are lots of games with very impressive graphics. I don't get why people keep repeating the same GL3 bullshit over and over again, yet still they have probably never read the GL3 spec. Even today, GL3 is still niche in the game industry, and that won't change anytime soon unless Intel and Apple implement GL3 in their drivers/systems; that's more than 50% of the market, and there is a ton of legacy ATI and NVIDIA hardware out there.

          Originally posted by kazetsukai View Post
          Driver developers will then respond to those increased expectations, since for the first time there is high demand for said features.
          Driver developers already respond to much higher expectations than KDE could ever make. There are even games, you know.



          • Originally posted by Hephasteus View Post
            Sorry, the R300 GPU is DX7. It's not an OpenGL 2.0 GPU and can't be.
            R100 is DX7, R200 is DX8, and R300 through R500 are DX9. GL2 is somewhere between DX9 and DX10; it's basically DX9 with additional features that DX9 hardware doesn't have. The R300 hardware interface matches the DX9 API almost *exactly*; see the r5xx docs, it's all there.

            Originally posted by Hephasteus View Post
            NV30 is NVIDIA's 5 series. That's a DX8 GPU, not a DX9 GPU and not an OpenGL 2.0 GPU.
            NV30 is also within the limits of D3D9, but it isn't within the limits of GL2.

            You also seem to be forgetting about Intel, which accounts for approximately 50% of the GPU market, and those are mainly D3D9 GPUs too.

            Originally posted by Hephasteus View Post
            ATI's X1300 to X1950 and NVIDIA's 6xxx series should be the bottom line in my mind. Everything else is a decade old and completely unsupported by hardware vendors on any OS, except for maybe legacy Windows 98 drivers hanging around on their websites.
            You are so wrong. Aero in Windows 7 supports the lowest level of D3D9 hardware even today, including R300, NV30, and Intel GMA.

            I don't think Linux can afford to kill its userbase by dropping hardware support. KDE obviously doesn't care.



            • Originally posted by Hephasteus View Post
              Sorry, the R300 GPU is DX7. It's not an OpenGL 2.0 GPU and can't be. I think R500 is where you have to go for that. NV30 is NVIDIA's 5 series. That's a DX8 GPU, not a DX9 GPU and not an OpenGL 2.0 GPU.
              That's incorrect. R300 supports DX9 (SM2.0), as does NV30 (SM2.0a). R400 is SM2.0b and R500 is SM3.0.

              OpenGL 2.1 demands some features that are not supported by those GPUs, namely repeat/mirror wrap modes for NPOT textures (implemented in R600), separate stencil (R500?) and separate blend equations (GeForce 7 series, IIRC); the call sketch at the end of this post shows what those look like in practice.

              The closed-source drivers for those cards advertise OpenGL 2.1 even if they don't fully support it. The reason they get away with this is that few games/apps use those unsupported features (mainly because they are not all that useful and because they are not well supported on those older GPUs).

              DirectX 9 is slightly more in tune with what the hardware actually supports, mainly because the spec was developed alongside the hardware. AFAIK, the only required feature that is *not* supported is vertex texture fetch on R500 and earlier; ATI worked around the requirement on a technicality (the cards claimed to support VTF but didn't expose any texture formats for it, which was allowed by the spec in letter, if not in spirit).
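
              To make the feature list above concrete, these are roughly the GL 2.1 calls involved. A minimal sketch only, assuming a current context and a bound NPOT texture; whether the driver handles them in hardware or falls back to software is exactly the generational question discussed here.

              /* Expose GL 2.x prototypes with the Mesa headers
                 (an assumption; a loader like GLEW works too). */
              #define GL_GLEXT_PROTOTYPES
              #include <GL/gl.h>
              #include <GL/glext.h>

              static void exercise_gl21_features(void)
              {
                  /* 1. Repeat/mirror wrap modes on a non-power-of-two texture
                        (software fallback on pre-R600 hardware). */
                  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_MIRRORED_REPEAT);
                  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

                  /* 2. Separate front/back stencil state (core since GL 2.0). */
                  glStencilFuncSeparate(GL_FRONT, GL_ALWAYS, 1, 0xFF);
                  glStencilOpSeparate(GL_BACK, GL_KEEP, GL_KEEP, GL_REPLACE);

                  /* 3. Separate RGB/alpha blend equations (core since GL 2.0). */
                  glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_REVERSE_SUBTRACT);
              }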



              • Originally posted by marek View Post
                Progress? Bullshit. There is no reason to demand GL3, there is no feature in GL3 you need for desktop effects.
                Theoretically, we don't even /need/ compositing for desktop effects. Shit, we don't /need/ desktop effects -at all-, since the poor guys using the VESA driver can't use them. I guess we don't really /need/ 3D support either. What got those things implemented? User demand and tons of hacking followed up by hard work. You're telling desktop developers to not raise the bar? That's irresponsible. Complacency is the worst enemy of progress.

                Originally posted by marek View Post
                Game consoles are stuck in the D3D9 level of features and there are lots of games with very impressive graphics.
                How are game consoles relevant to this, exactly?

                Originally posted by marek View Post
                I don't get why people keep repeating the same GL3 bullshit over and over again, yet still they have probably never read the GL3 spec. Even today, GL3 is still niche in the game industry, and that won't change anytime soon unless Intel and Apple implement GL3 in their drivers/systems; that's more than 50% of the market, and there is a ton of legacy ATI and NVIDIA hardware out there.
                Game consoles don't get hardware upgrades, and PCs generally use DX for gaming anyway. Where are most games now? DX9 compliance. But the work being done on DX9 is stagnant compared to the DX10 and DX11 APIs, because there -are- companies implementing those features. You're complaining that KDE is implementing GL3? Complain that Crysis uses DX10/DX11 while you're at it!

                Perhaps I'm reading this wrong: You're justifying the limited featureset of Open Drivers because game consoles and Apple haven't implemented these technologies? I don't really see those as reasons for not implementing features standardized years ago, in a desktop environment and already implemented in the blobs (which are the only officially supported drivers of the cards by their respective vendors anyway).

                You also seem to be telling me that when a lot of people file and complain about a graphics bug, that doesn't push the development teams to get it fixed? lol! Look at compositing support: that was a feature that could be classified as completely unnecessary, and it was extremely buggy out of the gate. Now, if no one had used it, would it have been triaged and prioritized the way it was? Not a chance!

                Originally posted by marek View Post
                Driver developers already respond to much higher expectations than KDE could ever make. There are even games, you know.
                Games aren't desktop environments. A game won't make every graphics driver developer turn and say "hey, we need to fix these." A desktop environment? Maybe, since there are probably a lot more people running these desktop environments on Linux than there are people playing game X with said features implemented. And if you think you're earning cred by belittling KDE's work, you're kidding yourself.



                • Originally posted by kazetsukai View Post
                  Theoretically, we don't even /need/ compositing for desktop effects. Shit, we don't /need/ desktop effects -at all-, since the poor guys using the VESA driver can't use them. I guess we don't really /need/ 3D support either. What got those things implemented? User demand and tons of hacking followed up by hard work. You're telling desktop developers to not raise the bar? That's irresponsible. Complacency is the worst enemy of progress.
                  Give me a list of graphics techniques (requiring GL3) we should have in desktop effects. Then we can talk.

                  Originally posted by kazetsukai View Post
                  Complain that Crysis uses DX10/DX11 while you're at it!
                  I can't complain about Crysis because I know the graphics algorithms it uses; I have already implemented some of them on R500 and could implement the rest if I had enough time to spare. Crysis mainly uses DX9, with the DX10 option being a result of their marketing strategy rather than a necessity.

                  Originally posted by kazetsukai View Post
                  Perhaps I'm reading this wrong: You're justifying the limited featureset of Open Drivers because game consoles and Apple haven't implemented these technologies? I don't really see those as reasons for not implementing features standardized years ago, in a desktop environment and already implemented in the blobs (which are the only officially supported drivers of the cards by their respective vendors anyway).
                  This has nothing to do with open drivers. I am talking from the position of a graphics developer who has experience with graphics algorithms and has actually been paid for that work, and I am analyzing what path needs to be taken to get the best desktop experience and the broadest hardware coverage with the most effects possible. GL3 is not part of that, no matter what you might think. The age of a standard doesn't matter; the market share does. This is the practical point of view. I described the uselessness of GL3 for desktop effects from the technical point of view earlier, and illustrated with games that you can have very impressive graphics even with the oldest and crappiest APIs available.



                  • @kazetsukai: the point is that OpenGL 3.x does not offer any features over 2.1 that are relevant to desktop composition. Floating-point textures, R2VB, TBOs and UBOs, fences, cubemap filtering, OpenCL interop - none of those matters.

                    The only features that are relevant are geometry instancing (a tiny performance improvement if you have thousands of windows open at the same time), multisampled textures (to implement antialiasing, although these would likely have bad side effects on font rendering, so they are not that useful) and maybe geometry shaders (I can't imagine why, but you never know).

                    In short, OpenGL 3.x does not enable kwin to do anything that cannot be done in OpenGL 2.1. It won't make code simpler (since they'd have to keep the 2.1 and 1.5 fallbacks intact) and it would bring little new to the table.

                    Finally, games *do* matter because they are the driving force behind graphics hardware and drivers. Games would benefit *massively* from widespread OpenGL 3.x support, but we are not there yet. And if they can make do with OpenGL 2.1, there's little reason why kwin cannot.

                    My prediction: the next major kwin version will be able to instantiate OpenGL 3.x contexts but will not actually use them for anything just yet (the same capabilities as 2.1, with a slightly cleaned-up codepath, maybe). It will just lay down the foundation for future work.
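
                    For reference, "instantiating a 3.x context" amounts to something like the following on GLX. A sketch only, under the assumption that the display, framebuffer config and error handling (the extension can raise an X error on failure) are dealt with elsewhere; the entry points are from the GLX_ARB_create_context extension.

                    #include <GL/glx.h>
                    #include <GL/glxext.h>

                    /* Ask GLX for an OpenGL 3.0 context; fall back to the legacy
                       2.x-style context if the extension isn't there. */
                    static GLXContext create_gl3_context(Display *dpy,
                                                         GLXFBConfig fbc,
                                                         GLXContext share)
                    {
                        static const int attribs[] = {
                            GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
                            GLX_CONTEXT_MINOR_VERSION_ARB, 0,
                            None
                        };
                        PFNGLXCREATECONTEXTATTRIBSARBPROC create_ctx =
                            (PFNGLXCREATECONTEXTATTRIBSARBPROC) glXGetProcAddress(
                                (const GLubyte *)"glXCreateContextAttribsARB");

                        GLXContext ctx = create_ctx
                            ? create_ctx(dpy, fbc, share, True, attribs)
                            : NULL;
                        if (!ctx)  /* no 3.x support: keep using the 2.1 path */
                            ctx = glXCreateNewContext(dpy, fbc, GLX_RGBA_TYPE,
                                                      share, True);
                        return ctx;
                    }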



                    • What's the difference between GLSL 1.2 and 1.3? I'm simply guessing that's why they are going OpenGL 3?



                      • Originally posted by V!NCENT View Post
                        What's the difference between GLSL 1.2 and 1.3? I'm simply guessing that's why they are going OpenGL 3?
                        GLSL 1.3 renames the "attribute" and "varying" keywords to "in" and "varying", respectively, in order to make room for more shader stages (geometry, etc). It also introduces derivatives (ddx, ddy), a few new texture functions (texture*lod that allows you to control the mipmap it samples from), deprecates built-in uniforms, modifies the syntax for fragment output for easier MRT and adds a couple of new keywords (meant for OpenGL ES compatibility).

                        Again, none of those features is necessary - or even useful - to desktop composition. I can understand kwin wanting a GL3-compatible codebase for future work, but I will be very surprised if it offers any new functionality (over GL2) in the immediate future. I certainly cannot see them dropping 50% of their user base for no tangible benefit.
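
                        To make the 1.2-versus-1.3 difference concrete, here is the same trivial shader written both ways (as C string literals; purely illustrative, and using the in/out keywords as corrected a few posts further down):

                        /* The same pass-through vertex shader in GLSL 1.20 and 1.30
                           syntax, plus a 1.30 fragment shader showing the explicit
                           output that replaces gl_FragColor. Illustrative only. */
                        static const char *vs_120 =
                            "#version 120\n"
                            "attribute vec2 position;\n"
                            "attribute vec2 texcoord;\n"
                            "varying vec2 uv;\n"
                            "void main() {\n"
                            "    uv = texcoord;\n"
                            "    gl_Position = vec4(position, 0.0, 1.0);\n"
                            "}\n";

                        static const char *vs_130 =
                            "#version 130\n"
                            "in vec2 position;\n"      /* was: attribute */
                            "in vec2 texcoord;\n"
                            "out vec2 uv;\n"           /* was: varying   */
                            "void main() {\n"
                            "    uv = texcoord;\n"
                            "    gl_Position = vec4(position, 0.0, 1.0);\n"
                            "}\n";

                        static const char *fs_130 =
                            "#version 130\n"
                            "uniform sampler2D tex;\n"
                            "in vec2 uv;\n"
                            "out vec4 frag_color;\n"   /* was: gl_FragColor */
                            "void main() { frag_color = texture(tex, uv); }\n";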



                          Well, with radeon (R600c/g) and Mesa git I didn't even notice that they had raised the OpenGL requirements. It just worked.
                          With the fucking Intel driver it still doesn't work properly (and I'm using the latest graphics stack from git). With the latest stable Intel driver I can't even start KDE to disable compositing!
                          ## VGA ##
                          AMD: X1950XTX, HD3870, HD5870
                          Intel: GMA45, HD3000 (Core i5 2500K)



                          • Originally posted by BlackStar View Post
                            GLSL 1.3 renames the "attribute" and "varying" keywords to "in" and "varying", respectively, in order to make room for more shader stages (geometry, etc). It also introduces derivatives (ddx, ddy), a few new texture functions (texture*lod that allows you to control the mipmap it samples from), deprecates built-in uniforms, modifies the syntax for fragment output for easier MRT and adds a couple of new keywords (meant for OpenGL ES compatibility).
                            DDX and DDY are already in GLSL 1.1.



                            • Originally posted by marek View Post
                              DDX and DDY are already in GLSL 1.1.
                              Oops, just checked the specs and you are right. Although I kind of question their usefulness, since 1.1 doesn't offer the texture*Grad functions.

                              Fake edit to my previous post:
                              - "in" and "varying" should obviously be "in" and "out"
                              - texture*Lod should be texture*Grad. The former has existed since GLSL 1.1 (maybe earlier) and allows you to select the mipmap level, but it's the latter that allows you to specify the gradients explicitly (necessary for artifact-free parallax mapping, for instance).
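
                              For illustration, the distinction in a GLSL 1.30 fragment shader (a sketch as a C string literal; the names and the sampler here are hypothetical, not from kwin or any real effect):

                              /* textureLod picks an explicit mipmap level;
                                 textureGrad supplies the derivatives instead,
                                 which is what keeps branchy effects like
                                 parallax mapping artifact-free. */
                              static const char *fs_grad =
                                  "#version 130\n"
                                  "uniform sampler2D tex;\n"
                                  "in vec2 uv;\n"
                                  "out vec4 color;\n"
                                  "void main() {\n"
                                  "    vec2 dx = dFdx(uv);\n"
                                  "    vec2 dy = dFdy(uv);\n"
                                  "    vec4 a = textureLod(tex, uv, 2.0);\n"
                                  "    vec4 b = textureGrad(tex, uv, dx, dy);\n"
                                  "    color = mix(a, b, 0.5);\n"
                                  "}\n";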



                              • And a slightly off-topic question: does Mesa implement the noise*() functions for GLSL?

