More Details On The OpenGL Situation In KDE's KWin
Originally posted by whizse View Post
Debian stable comes with 4.4. 4.6 and 4.7 are available from unstable and experimental. The whole point of a stable Debian release is that it should be just that: stable.
So this KWin developer is unable to test with the open drivers because Debian stable doesn't provide them. But he is a developer after all, so why not pull what's needed from Debian testing/experimental then?
People seem to think that driver devs should test their code with every app out there (a huge task). I say KWin devs should test theirs with both the blob and the open drivers (a small task), and then report back what doesn't work. Am I the only one who thinks this is the best approach? There's a huge swarm of applications out there but only a small set of drivers, so having the application devs test their code is much easier and quicker than having the driver devs test theirs against every app.
That number seems unlikely.
Originally posted by kraftman View Post
Damn troll. Check this in Gnome. It's made mainly in devs' free time. So you want to tell me that Gnome tries to "compete" being developed mainly in free time? And about volunteers: so you want to tell me that Gnome tries to "compete" using only a small number of volunteer devs? Btw, how did you figure out such stupid conclusions?
To argue that Gnome is less corporate seems a strange thing to attempt...
Let me say that I don't have a horse in this race.
Originally posted by pingufunkybeat View Post
Templar, stop trolling.
It's incredible what this thread has turned into. The situation is quite simple, really:
- Mesa advertises features that are known to be broken. The driver guys should think about this.
- The communication between the KWin team and the Mesa team was not really good in this case. All the devs should think about this instead of pointing fingers.
I mean, it's obvious that someone would march in here with a prepared "KDE=SATAN!" diatribe, but it's just a friggin optional eye-candy plugin, and you have to manually disable the blacklist to even run it.
Basically, driver development is, apparently, much harder than WM development, and these companies don't want to pay for more devs, so we have to work with them. So: replace/help your dev, or pay for some more X/Mesa coders.
...or recognize that when there is a mismatch between hardware granularity and OpenGL extension granularity you are going to have cases where partially implemented extensions are the most practical option (not exposing the extension => apps don't run, fully exposing the extension => apps run real slow because of software emulation).
If I may offer a couple of examples:
1. A lot of DX9-family chips don't exactly match the OpenGL 2.x requirements, since they were designed *after* DX9 was released but *before* OpenGL 2.x was released. They do, however, have enough functionality to implement a subset of GL2 extensions that works for most applications. In some cases the situation can be improved by patching in complex shader code, but that brings both performance and compatibility problems, and in most cases the only real solution is to fall back to software rendering for that extension.
What should the driver developers do?
2. When developing support for new GPUs, the work tends to continue for a year or so, going from "a few apps work" to "pretty much everything works". Most of the time is spent in a "most apps work" state, where most of the advertised extensions work enough to let apps run well but the extensions don't pass all tests and all options, and probably won't for another 6-12 months.
Should the extensions be exposed?
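This dilemma is why careful apps do what KWin's blacklist does: treat the advertised extension string as a hint, not a guarantee, and keep a per-driver deny list for combinations known to be advertised but broken. A minimal sketch of that app-side logic (plain Python, no GL context needed; the renderer strings and the deny list entries are made-up illustrations, not KWin's or Mesa's actual tables):

```python
# Hypothetical deny list: (renderer substring, extension) pairs that are
# advertised by the driver but known to be unreliable in practice.
DENY_LIST = {
    ("Mesa DRI R300", "GL_ARB_fragment_program"),
    ("Mesa DRI Intel 945", "GL_EXT_framebuffer_object"),
}

def extension_usable(renderer, extensions, wanted):
    """Decide whether an app should actually use `wanted`.

    `renderer` is what glGetString(GL_RENDERER) would return and
    `extensions` what glGetString(GL_EXTENSIONS) would return; both
    are passed in as strings so this sketch needs no GL context.
    """
    if wanted not in extensions.split():
        return False  # not advertised at all
    for bad_renderer, bad_ext in DENY_LIST:
        if bad_renderer in renderer and bad_ext == wanted:
            return False  # advertised, but known broken: fall back
    return True

exts = "GL_ARB_fragment_program GL_ARB_vertex_program"
print(extension_usable("Mesa DRI R300 ...", exts, "GL_ARB_fragment_program"))  # False
print(extension_usable("Mesa DRI R600 ...", exts, "GL_ARB_fragment_program"))  # True
```

The point of the deny-list shape is that it degrades gracefully in both directions: a partially implemented extension can stay exposed for the apps it does serve, while apps that know better can opt out per driver.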
Comment
-
Originally posted by bridgman View Post
...or recognize that when there is a mismatch between hardware granularity and OpenGL extension granularity you are going to have cases where partially implemented extensions are the most practical option (not exposing the extension => apps don't run, fully exposing the extension => apps run real slow because of software emulation).
If I may offer a couple of examples:
1. A lot of DX9-family chips don't exactly match the OpenGL 2.x requirements, since they were designed *after* DX9 was released but *before* OpenGL 2.x was released. They do, however, have enough functionality to implement a subset of GL2 extensions that works for most applications. In some cases the situation can be improved by patching in complex shader code, but that brings both performance and compatibility problems, and in most cases the only real solution is to fall back to software rendering for that extension.
What should the driver developers do?
2. When developing support for new GPUs, the work tends to continue for a year or so, going from "a few apps work" to "pretty much everything works". Most of the time is spent in a "most apps work" state, where most of the advertised extensions work enough to let apps run well but the extensions don't pass all tests and all options, and probably won't for another 6-12 months.
Should the extensions be exposed?
1. This doesn't seem to be the matter here. Were it so, the blobs would have faced the same issues (they run on the same silicon after all).
2. Kwin's issues seem to be with older "pretty much everything works" hardware, not with beta R800/Nouveau-level 3d support.
Originally posted by Hephasteus
His blog is wrong. You can't do blurs and transparent windows without OpenGL 2.0. Well, you can, it's just too hard and not worth it, which is why it was done differently on DX9 / OpenGL 2.0-2.1 GPUs.
You have many options: mipmap blur (GL1.1), render-to-texture blur (*), Gaussian blur (GL1.4 with ARB shaders or GL2.1 with GLSL shaders). Take your pick.
(*) render-to-texture is available on most GL1.5+ GPUs, as long as the drivers don't suck.
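To make "take your pick" concrete, here is a minimal sketch (plain Python, no GL required) of the separable Gaussian kernel that the GL1.4/GL2.1 shader paths would evaluate per fragment; the radius and sigma values are illustrative, not taken from KWin:

```python
import math

def gaussian_kernel(radius, sigma):
    """Return normalized 1D Gaussian weights for a separable blur.

    A separable blur applies these weights first along X, then along Y,
    turning one O(n^2) 2D convolution into two O(n) passes -- the same
    trick a GLSL fragment shader uses when blurring a texture.
    """
    weights = [math.exp(-(x * x) / (2.0 * sigma * sigma))
               for x in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur_1d(samples, weights):
    """Convolve a 1D signal with the kernel, clamping at the edges
    (the equivalent of GL_CLAMP_TO_EDGE texture addressing)."""
    radius = len(weights) // 2
    out = []
    for i in range(len(samples)):
        acc = 0.0
        for k, w in enumerate(weights):
            j = min(max(i + k - radius, 0), len(samples) - 1)
            acc += samples[j] * w
        out.append(acc)
    return out

kernel = gaussian_kernel(radius=2, sigma=1.0)
print([round(w, 4) for w in kernel])
print([round(v, 3) for v in blur_1d([0.0, 0.0, 1.0, 0.0, 0.0], kernel)])
```

The mipmap option trades this per-tap math for hardware mip generation (cheap but boxy-looking), which is exactly why it works all the way back to GL1.1.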
Originally posted by BlackStar View Post
1. This doesn't seem to be the matter here. Were it so, the blobs would have faced the same issues (they run on the same silicon after all).
2. Kwin's issues seem to be with older "pretty much everything works" hardware, not with beta R800/Nouveau-level 3d support.
The older "pretty much everything works" hardware is exactly the "not quite OpenGL 2.x but everyone expects it anyway" hardware I talked about in (1). That is the worst-case scenario for OpenGL support, since the DX and OpenGL specs diverged the most at that point (both in time and in granularity) and most of the hardware was designed around DX specs because the corresponding OpenGL specs did not exist.