AMD Catalyst Will Not Support Wayland Anytime Soon
-
Originally posted by energyman:
who the fuck cares? really? It will take years until Wayland is ready for the masses - and I still don't understand what is so great about it.
So if it is years away and doesn't even have an API yet, why should NVIDIA or AMD waste precious (and expensive) resources on it, as long as there is X11 stuff to be done?
Forgive me if you were being rhetorical.
Everyone who runs desktop Linux, and everyone who manufactures graphics devices, will care in approximately two years if Wayland becomes the prominent display-server technology. Wayland is supposed to simplify the graphics stack by moving a number of functions into GEM/DRM and managing compositing itself. With X, compositors are added on top (KWin, Compiz, Mutter, etc.).
My opinion is that Wayland has the potential to be a lot more elegant and performant in the typical "a computer with a display device attached" scenario. It does not seem to address the "window over a network, game over the cloud, automagic remote desktop" cases as well as X11 currently does.
NVIDIA/AMD would allocate resources if they felt that Wayland would gain prominence and they wanted to be first to market.
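To make the "compositors are added on top" point concrete, here is a minimal sketch of what a compositor's core job is: blending per-window pixel buffers back-to-front into one framebuffer with the "over" operator. This is illustrative only; it is not Wayland's or X's actual API.

```python
# Illustrative sketch of compositing: blend per-window pixel buffers
# back-to-front into a single framebuffer ("over" operator).
# NOT real Wayland or X code -- just the core idea.

def composite(windows, width, height):
    """windows: list of (x, y, buffer) drawn back-to-front, where
    buffer is a 2D list of (r, g, b, a) pixels with a in [0, 1]."""
    fb = [[(0.0, 0.0, 0.0)] * width for _ in range(height)]
    for wx, wy, buf in windows:
        for row, line in enumerate(buf):
            for col, (r, g, b, a) in enumerate(line):
                x, y = wx + col, wy + row
                if 0 <= x < width and 0 <= y < height:
                    br, bg, bb = fb[y][x]
                    fb[y][x] = (r * a + br * (1 - a),
                                g * a + bg * (1 - a),
                                b * a + bb * (1 - a))
    return fb

# An opaque red window under a half-transparent blue one
# blends to purple:
red = [[(1.0, 0.0, 0.0, 1.0)]]
blue = [[(0.0, 0.0, 1.0, 0.5)]]
fb = composite([(0, 0, red), (0, 0, blue)], 1, 1)  # fb[0][0] == (0.5, 0.0, 0.5)
```

Under X this blending runs in a separate compositing manager process on top of the X server; Wayland's design puts the compositor in charge of the display directly.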
-
I remember Berlin/Fresco, Y Window and a couple more 'great next things'.
You know why they really failed?
They could not run X programs.
Wayland may be a nice thing, but apart from a nice-looking architecture, what does it have for the user? Why be excited about something that is a) a long way away and b) doesn't even have a stable API yet?
Wake me up if Wayland becomes a viable replacement for X. At the moment I can take a 10-year-old X app and just run it. Do that with Wayland and it might have a future.
Until it is ready, all those Phoronix postings about drivers supporting it or not are just a waste of resources.
-
To be fair, Wayland has a lot more mindshare among important players (Mesa, X.org, Red Hat, Intel, etc.) than Fresco or Y Window ever did. Even something as broken as PulseAudio or Unity was turned into a quasi-standard with proper backing, so it's not unlikely that the same could happen to Wayland in due time - and it seems far better thought through than those.
Still, it's a hobby research project at the moment, and even if it does come to replace X on some of the desktop-oriented distros, it won't be any time soon. It's just one of those things Phoronix people get all excited about, like monorails on Slashdot.
-
Originally posted by LinuxID10T:
You did see how I said primitive, right? MSAA just really isn't very good compared to the newer AA methods available.
Course, neither really compares to a higher-res screen. The death of AA can't come soon enough, in my opinion.
-
Originally posted by ownagefool:
MSAA offers better image quality at a higher performance cost and is still the de facto AA standard. The newer AA methods are almost all based around doing a quick, messy job, and the choice is yours where you draw the performance/image-quality line. It has nothing to do with MSAA being primitive.
Course, neither really compares to a higher-res screen. The death of AA can't come soon enough, in my opinion.
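The quality/cost trade-off described above comes from multisampling's core trick: testing several sample points per pixel instead of one to estimate how much of the pixel a primitive covers. A simplified model of that idea (not real GPU code; real MSAA shades once per pixel while keeping coverage per sample):

```python
# Simplified model of multisampling: estimate a pixel's coverage by a
# primitive from several sample points instead of one center sample.

def coverage(px, py, samples, inside):
    """Fraction of sample offsets (within the unit pixel at px, py)
    that fall inside the primitive, per a point-in-primitive test."""
    hits = sum(inside(px + sx, py + sy) for sx, sy in samples)
    return hits / len(samples)

# A half-plane edge: everything left of x = 0.3 is "inside".
inside = lambda x, y: x < 0.3

one_sample = [(0.5, 0.5)]                    # plain 1x rasterization
four_samples = [(0.25, 0.25), (0.75, 0.25),  # simple 4x pattern
                (0.25, 0.75), (0.75, 0.75)]

# The pixel at the origin is ~30% covered by the half-plane:
c1 = coverage(0.0, 0.0, one_sample, inside)    # 0.0 -> hard edge
c4 = coverage(0.0, 0.0, four_samples, inside)  # 0.5 -> smoother edge
```

With one sample the edge snaps to fully-off (the staircase artifact); with four samples the pixel gets an intermediate value, which is why higher sample counts cost more but look better.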
-
Originally posted by ownagefool:
Course, neither really compares to a higher-res screen. The death of AA can't come soon enough, in my opinion.
We'll likely see resolutions stabilize at 2K and 4K, followed by exponential increases in scene complexity as outward-scaling GPUs and CPUs begin to share memory.
After that, it will probably paradigm-shift toward light-field (LF) rendering, or voxels/rays, or some other emergent fringe tech. I've seen a number of compelling light-field demonstrations out of Stanford and MIT in the past few years. Neat stuff.
-
Originally posted by LinuxID10T:
Even on 300+ DPI screens, AA is noticeable.
If I cannot perceive the pixels, there's no "aliasing" to begin with when rendering at the screen's native resolution. Perhaps "when rendering at the screen's native resolution" is the cause of our opposing perceptions. I would concede that your statement holds true when rendering below the native resolution, since we can demonstrate AA's benefits there without high-PPI displays.