X.Org vs. Wayland Linux Gaming Performance For NVIDIA GeForce + AMD Radeon In Early 2023
-
Originally posted by piotrj3 View Post
Neither does Wayland take (good enough) advantage of modern hardware, because almost the entire graphics stack is implicitly synced (like old DirectX/OpenGL). Modern stacks like Windows' WDDM, Mantle, Vulkan, DX12 and Metal are explicitly synced, because you need that for multi-threading, parallelism and dropping unnecessary synchronization fences when you make a lot of draws. The thing is, Apple (macOS), Microsoft (since Vista) and Google (Android) spent a long time preparing the underlying kernel and other low-level technology for the explicit model.
GPUs have been designed with that in mind for years, and they contain hardware schedulers built around it. Wayland could technically run in an explicit way, but doesn't, because most of the Linux stack isn't ready. So until we flip the switch and move (almost) entirely to explicit synchronization, Wayland is nothing modern, at least compared to the graphics stacks of Linux's competitors.
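For readers who haven't run into the implicit/explicit distinction: in an explicit API the application itself declares what the GPU must wait for and what it signals, instead of the driver tracking fences per shared buffer behind its back. A minimal, non-authoritative sketch of the Vulkan side of this; the function and semaphore names are made up, and device/queue/command-buffer setup is assumed to exist already:

```c
// Sketch of explicit synchronization in Vulkan: the application states the
// dependencies itself rather than relying on implicit per-buffer fencing.
// Assumes device, queue, cmd_buf and the two semaphores were created
// elsewhere; instance/device/swapchain setup is omitted for brevity.
#include <stdint.h>
#include <vulkan/vulkan.h>

VkResult submit_explicit(VkDevice device, VkQueue queue,
                         VkCommandBuffer cmd_buf,
                         VkSemaphore image_ready_sem,
                         VkSemaphore render_done_sem)
{
    // A fence the CPU can wait on to learn that this submission retired.
    VkFenceCreateInfo fence_info = {
        .sType = VK_STRUCTURE_TYPE_FENCE_CREATE_INFO,
    };
    VkFence fence;
    VkResult res = vkCreateFence(device, &fence_info, NULL, &fence);
    if (res != VK_SUCCESS)
        return res;

    // Explicit GPU-side dependency: don't write color output until the
    // swapchain image is available, and signal render_done_sem when done.
    VkPipelineStageFlags wait_stage =
        VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT;
    VkSubmitInfo submit = {
        .sType = VK_STRUCTURE_TYPE_SUBMIT_INFO,
        .waitSemaphoreCount = 1,
        .pWaitSemaphores = &image_ready_sem,
        .pWaitDstStageMask = &wait_stage,
        .commandBufferCount = 1,
        .pCommandBuffers = &cmd_buf,
        .signalSemaphoreCount = 1,
        .pSignalSemaphores = &render_done_sem,
    };
    res = vkQueueSubmit(queue, 1, &submit, fence);

    // Explicit CPU-side wait; nothing here is inferred by the driver.
    if (res == VK_SUCCESS)
        res = vkWaitForFences(device, 1, &fence, VK_TRUE, UINT64_MAX);

    vkDestroyFence(device, fence, NULL);
    return res;
}
```

The point of contention in the thread is that today's Linux window-system plumbing still resolves such dependencies implicitly between client and compositor, even when the application renders with Vulkan.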
- Likes 1
-
Originally posted by qarium View Post
For many apps like Blender, CUDA is no longer a differentiating feature, because ROCm/HIP runs Blender too. Of course you'll then say OptiX is faster, right, but AMD is already working on that part too.
- Likes 2
-
Originally posted by andyprough View Post
Redhat/IBM, Canonical and SuSE don't give a crap about selling workstations to governments - all they care about is selling the cloudy bits, because that's where the money is. Governments are still running loads of stuff on old Windows 95 and Windows XP boxes and so forth - what is this nonsense about selling governments on security?
Wayland exists, just like so many projects in modern Linux, because devs get paid more money if they run around to conferences screaming about "security" all the time, even if there's never been an exploit. It's the same reason all of our CPUs are mitigated against theoretical security threats that have never been exploited. Security theater pays big bucks and keeps a lot of people employed.
And of course these secret government contracts make explicit use of security clauses...
-
I'm currently playing Transport Fever 2, which uses SDL2, on Wayland. Good to know from these results that there shouldn't be any performance penalty for using Wayland, although my 3700X is probably holding performance back a bit, even at 2160p (radeontop reports about 80% GPU utilisation during play).
Edit: more like 60%, but it's a little stuttery, hence I suspect a CPU bottleneck.
-
This is a proof-of-concept Wayland keylogger that I wrote to demonstrate the fundamental insecurity of a typical Linux desktop that lacks both sandboxing (chroot, cgroups, ...) and mandatory access control (SELinux).
...
This program is in no way meant as criticism of the Wayland project. It simply demonstrates that creating a secure desktop requires more than just a few server-side restrictions.
By the way, this inherent weakness is not at all specific to Linux. Similar techniques would also work on Windows and Mac, and essentially any platform that doesn't sandbox applications.
Furthermore, people have linked you to many instances of X11 vulnerabilities being found and exploited. You seem to think that these don't count because nobody has decided to do anything malicious with them. That's a dumb way to think. That's like saying it's not a problem if everyone's passwords were displayed publicly on their profiles... until someone decides to use them to get and leak private information from users.
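One common mechanism behind this kind of proof of concept on an unsandboxed desktop is LD_PRELOAD interposition, the same vector Monsterovich brings up further down: any shared object injected into a client process can hook that client's library calls before they ever reach the compositor, so server-side restrictions never come into play. A deliberately benign sketch of the mechanism, hooking open() instead of anything input-related; file names are made up:

```c
// Illustrative LD_PRELOAD shim (not the keylogger itself): it interposes
// open() and logs the path before forwarding the call. The same mechanism
// lets a library injected into a client hook its display-protocol calls,
// regardless of whether the server is X.Org or a Wayland compositor.
//
// Build:  gcc -shared -fPIC -o shim.so shim.c -ldl
// Use:    LD_PRELOAD=./shim.so some_program
#define _GNU_SOURCE
#include <dlfcn.h>
#include <fcntl.h>
#include <stdarg.h>
#include <stdio.h>
#include <sys/types.h>

int open(const char *path, int flags, ...) {
    // Look up the "real" open() that this shim shadows.
    static int (*real_open)(const char *, int, ...);
    if (!real_open)
        real_open = (int (*)(const char *, int, ...))dlsym(RTLD_NEXT, "open");

    fprintf(stderr, "[shim] open(%s)\n", path);

    // open() carries a mode argument only when O_CREAT is set
    // (O_TMPFILE is ignored here to keep the sketch short).
    if (flags & O_CREAT) {
        va_list ap;
        va_start(ap, flags);
        mode_t mode = va_arg(ap, mode_t);
        va_end(ap);
        return real_open(path, flags, mode);
    }
    return real_open(path, flags);
}
```

Sandboxing (Flatpak-style portals, seccomp, MAC policies) is what closes this hole, which is exactly the "more than just a few server-side restrictions" point made above.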
- Likes 4
-
Originally posted by piotrj3 View Post
Because so many games that can actually be GPU-bound on a modern graphics card use those....
Does Unity support Wayland? Unreal Engine? Wine? DXVK? Currently only unofficial forks of Wine sort of support Wayland. Everything else is "no".
Unity: yes, via SDL2.
Wine: the Wayland driver is about to get merged.
Does DXVK really need Wayland support? It's just a translation layer.
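On the SDL2 point: an SDL2 game picks its video backend at init time, and you can check (or force) which one a given session actually uses. A small sketch, assuming an SDL2 build with the Wayland backend compiled in; the file and window names are made up:

```c
// Quick check of which SDL2 video backend a game ends up on.
// Build: gcc -o wlcheck wlcheck.c $(sdl2-config --cflags --libs)
#include <SDL2/SDL.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    // Ask SDL for its Wayland backend; newer SDL2 releases also expose this
    // as a hint. If the compositor is unreachable, SDL_Init simply fails and
    // you can retry without the override.
    setenv("SDL_VIDEODRIVER", "wayland", 1);

    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    // Prints "wayland" for a native session, "x11" when going through
    // X.Org or XWayland.
    printf("active video driver: %s\n", SDL_GetCurrentVideoDriver());

    SDL_Window *win = SDL_CreateWindow("wayland-check",
                                       SDL_WINDOWPOS_UNDEFINED,
                                       SDL_WINDOWPOS_UNDEFINED,
                                       640, 480, SDL_WINDOW_SHOWN);
    if (!win) {
        fprintf(stderr, "SDL_CreateWindow failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_Delay(2000);           // keep the window up briefly
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

This is also why the benchmarked SDL2 titles can run natively on Wayland without the game itself knowing or caring which display server is underneath.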
-
Originally posted by Myownfriend View Post
In other words, it's meant to show that it's not enough to have only one level of security. That is exactly what people have been telling you, and it is a case for Wayland. It's not a win for X11 to say that someone found an exploit that allows you to do in a Wayland session what can be done in Xorg without an exploit. The fact that they said this could be done on Windows and macOS as well should have told you that you probably misunderstood what was being said in that Git repo.
The supposed security in Wayland is a pathetic excuse for the lack of features in Wayland itself, which is ridiculous.
The truth is simple. There are no security extensions in X just because no one needs them. For most users, it is an unnecessary clutter. People have even written specifications for implementing security in applications. Do you know how many applications adopted them? Zero.
P.S. The permissions system should also validate the client against LD_PRELOAD and LD_LIBRARY_PATH overrides, so that an intruder cannot access the data through the clients.
- Likes 5
-
Originally posted by avis View Post
Speaking of X.org "insecurity" - not a single documented incident in its over 30 years of existence.
Speaking of Wayland "security" - Wayland compositors are executed under the user account (not as root with their own memory space) and can be trivially hijacked.
Speaking of
These are blatant lies. The US government and three-letter agencies do use RHEL. "Here's some useless agreement which I've just made up."
More info: Red Hat Enterprise Linux 7.x: EAL4+
Don't believe it? You'd better.
It works like this: the NSA, who use Red Hat, discover a hypothetical cybersecurity threat (even if no one has ever exploited the hole), then go to IBM/Red Hat and pay them money to fix it... and the fix for X11/Xorg is Wayland...
And for the NSA, stuff like "not a single documented incident in its over 30 years of existence" does not count; the only thing that counts is whether it is hypothetically possible or not. Then they go and pay IBM/Red Hat money to fix the problem.
-
Originally posted by WannaBeOCer View Post
OptiX is practically everywhere, and Nvidia has Tensor cores, while AMD only finally added AI accelerators to their consumer cards with the RX 7000 series. The only reason to use AMD GPUs currently is gaming; everything else will be quicker on Nvidia GPUs, and has been since 2018 thanks to Turing. For these tasks, the best bang for a user's hard-earned buck is Nvidia.
As far as I know, in 1-2 months AMD will release their ray-tracing extension for Blender, i.e. their OptiX alternative (of course Nvidia cards are faster at ray tracing, please spare me that part).
You can buy your Nvidia card, I don't care.
I will upgrade my Vega 64 to a 7900 XTX in the next few months.
- Likes 4