NVIDIA Continues Discussing Their Controversial Wayland Plans With Developers
Reading the discussion, it at least seems civil. Maybe I'm reading too much into it, but Nvidia said they knew this would be controversial, so my guess is that they don't know how to do this properly, so they just threw a patch at the mailing list in the hope of starting a discussion on how best to solve this problem on both sides.
Originally posted by 89c51: And that's exactly the reason Valve won't be installed on my machine. 64-bit.
By the way, x86_64 systems are perfectly capable of running x86 code; they were designed with backwards compatibility in mind. But it seems you are using some broken distro and bragging about it.
Originally posted by Rubble Monkey:
Is there any advantage to the way NVIDIA is doing it, or is it merely preference?
Nvidia isn't implementing a design meant only for Nvidia. They simply don't like the current method of doing things: "We (NVIDIA) clearly think EGLStreams is a good direction for expressing buffer sharing semantics. In our ideal world, everyone would implement these extensions and Wayland compositors would migrate to using them as the generic vendor-neutral mechanism for buffer sharing."
Ahem, what is wrong with using EGL? And if this won't be mainlined, wouldn't it be easy to implement an extension/plugin package named something like "wayland-nvidia" or "wayland-EGL" that simply adds support for Nvidia's chosen approach? Also, wasn't open source software about "developer and user freedom" to begin with? Nvidia clearly thinks this is the way to do things; who are we to tell them, "No, you're not allowed to do it this way," when we constantly blather about "freedom" in Linux and the open source world? Are we hypocrites now?
Also, what's wrong with having an alternative way of using EGL for buffer management rather than GBM? Couldn't it simply perform faster on hardware that supports it?
I don't see the problem, I just don't. I'm probably going to buy AMD next because I generally don't like Nvidia's way of doing things over the past couple of years, but this is not one of the things I find "controversial" about them. Unless there are some serious downsides to using EGL for buffer management, I think Nvidia should stand their ground.
Last edited by rabcor; 03 April 2016, 04:04 PM.