It's Time To Admit It: The X.Org Server Is Abandonware


  • angrypie
    replied
    Originally posted by tildearrow View Post
    Because it's pretty much complete and stable.

    The protocol has flaws, but Wayland still is a work in progress, lacking some important features...

    (and a secret plot to GNOMize or destroy KDE)
    Came here to say this, but GNOME won't remain the only semi-usable Wayland DE for long; they're so far ahead in the game because they started first. Devs will eventually agree on a way to share screens and such.

    In fact I think the demise of X will force different minds into some unprecedented collaboration, instead of what we had until now: people working around X's limitations and not giving a fuck if other DEs/WMs had to do the same. If that happens we'll see more consistency, with radically different DEs behaving more or less the same for arcane stuff (which has been a flaw of the Linux desktop since its inception).



  • Ironmask
    replied
    Originally posted by milkboy View Post
    This discussion is getting looong.

    IMO Xorg will be commercially abandoned, but will continue as a community project.
    Just because it has no commercial support does not mean it is abandonware; OpenSSL has been running on pure donations after all, and wasn't even commercially supported prior to Heartbleed.

    I mean, this sounds like many previous discussions to me,
    like GNOME 2 (now MATE), SysV init, Flash, the OSS sound server, etc....

    Even though they've been abandoned by the big players, a few still use them.
    So will it be abandonware any time soon (in less than 5 yrs)? Nope, many people dislike change.
    Will it be abandonware 10+ yrs from now? Yes, nothing lasts forever.

    The article title could be made better, like "The X.Org Server Is Commercially Abandonware"
    Of those, only MATE is actually an active project; everything else you listed is dead. Nobody wants Flash anywhere near their systems, and only fringe schizophrenics use SysVinit on Linux.
    You're conflating commercial interest with interest in general. Nobody wants to work on X11; it's just a complete mess. When Solaris of all things has more hobbyist support than X11, you know things are bad.



  • dragon321
    replied
    Originally posted by R41N3R View Post

    I use Firefox Dev on Wayland too, but now videos are flickering and often there is just a black screen. Menus of addons are mostly white boxes. The close button in CSD mode is broken. Scaling is not working. And the basic renderer flickers terribly on Plasma Wayland. Then Firefox again just shows only a Wayland icon. Just look at the bug reports before asking people what they are talking about ;-)
    You can find bug reports for basically everything, but that doesn't mean it is a problem for everybody.

    Originally posted by bearoso View Post
    It was a few months back. Parts of the UI didn’t work right, and websites didn’t draw completely. I’ll give it a shot and see if it’s improved.
    I don't remember for sure when I switched to Wayland, but as I said, it's perfectly reliable for me now.



  • oiaohm
    replied
    Originally posted by blacknova View Post
    After all, if no standard compositor is present, why bother with compatibility? Are we even sure that a couple of years ahead KDE and GNOME compositors will be compatible enough to run each other's applications?
    The answer to this will most likely be yes, KDE and GNOME compositors will be able to run each other's applications, because they do work with each other in this area; they learned a lot from the artsd vs. esound problem.

    But there is a bigger problem: will non-KDE/GNOME compositors be able to run KDE/GNOME applications? That could be a bigger problem child. Both KDE and GNOME on Linux are going down the path of systemd user session management. KDE and GNOME applications not working with Weston, the reference Wayland compositor, without major messing around is a real possibility.



  • oiaohm
    replied
    Originally posted by tildearrow View Post
    False. The compositor adds latency, like it or not. Always. There is no way this can be avoided, as we cannot predict exactly how long the compositor will take to draw the next frame.
    That is absolutely not true for a Wayland compositor. Exclusive full screen can be split in two. A Wayland compositor can do a zero-copy buffer between the application and the GPU. In the Wayland protocol design, that zero-copy buffer means the compositor does not have to draw the next frame at all if it has a full-screen buffer from something. So you don't need an exclusive full-screen mode, as you do in X11, when you have the zero-copy buffer option. The zero-copy buffer comes from EGL or Vulkan and was designed to be used with the Wayland protocol.

    With Wayland, the time for the compositor to draw the next frame can be as low as zero due to the zero-copy buffer effect. So in a well-behaving Wayland compositor, an application covering the full screen should automatically have the compositor step out of the way. This is not like an X11 compositor, where you have to go exclusive full screen and ask the compositor to step out of the way to allow a zero-copy buffer. The zero-copy part of exclusive full screen is basically built into the Wayland protocol and should be there if the compositor is implemented well.
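
    To make the "compositor steps out of the way" point concrete, here is a minimal sketch (not taken from any particular compositor) of how a KMS-based Wayland compositor can skip its own rendering for a frame and flip the client's buffer straight to the display. It assumes the client's buffer has already been imported as DRM framebuffer fb_id and that crtc_id identifies the output; both are hypothetical placeholders.
    Code:
    /* Sketch only: skip compositing for this frame and scan out the
     * client's buffer directly (the zero-copy case described above). */
    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    /* client_fb_id: DRM framebuffer created from the client's buffer
     * (e.g. via drmModeAddFB2); crtc_id: the output's CRTC. */
    static bool try_direct_scanout(int drm_fd, uint32_t crtc_id,
                                   uint32_t client_fb_id)
    {
        /* Ask the kernel to flip to the client's buffer and send a
         * completion event; the compositor drew nothing itself. */
        if (drmModePageFlip(drm_fd, crtc_id, client_fb_id,
                            DRM_MODE_PAGE_FLIP_EVENT, NULL) != 0) {
            /* Buffer not scanout-capable (format/modifier/size):
             * fall back to normal GPU compositing for this frame. */
            fprintf(stderr, "direct scanout rejected, compositing\n");
            return false;
        }
        return true;
    }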

    The big thing you don't really want is the exclusive-full-screen bonus of direct control over the screen resolution; that is not always a good thing.

    Originally posted by tildearrow View Post
    Sure, wasted bandwidth and power. The dedicated circuitry on a separate monitor is often more power efficient, and guaranteed to deliver the scaled image in time.
    A few errors here. When you get into using TVs as monitors, you will find that they do not always deliver the scaled image on time if you are not using the common resolutions. Yes, some monitors can skip frames at particular resolutions, and it is really, really horrible.


    Originally posted by tildearrow View Post
    Furthermore you did not consider this. What if the laptop cannot output... say... 4K at 60Hz but rather only 30? So what? Should I live with the lower framerate?!
    There is no 100 percent perfect answer here. I used 4K as an example, but there are also cases where you want to hide the fact that you have 4K from the application.

    Originally posted by tildearrow View Post
    The problem is that if you use the graphics card for scaling the game you cannot be sure that the scaling time is constant. So there is a chance it could potentially miss a frame if the game is loading the graphics card too much.
    This depends on the graphics card. Some GPUs have dedicated scaling circuits exactly like a monitor does. Think of an LCD panel with no brains connecting directly to the GPU. So on particular GPUs you can be 100 percent sure that GPU scaling is 100 percent constant.
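
    As a rough illustration of what a scaling circuit in the scanout path means in KMS terms: the source and destination rectangles of a hardware plane do not have to match, so on GPUs whose planes support scaling, the display controller stretches a 1920x1080 buffer onto a 3840x2160 output with no extra copy or shader pass. This is only a sketch; plane_id, crtc_id and fb_id are hypothetical placeholders, and on hardware without plane scaling the call simply fails and the compositor has to fall back to shader scaling.
    Code:
    /* Sketch: scan out a 1920x1080 buffer onto a 3840x2160 CRTC.
     * Destination coordinates are integer pixels; source coordinates
     * are 16.16 fixed point, hence the << 16. */
    #include <stdint.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    static int scanout_scaled(int drm_fd, uint32_t plane_id,
                              uint32_t crtc_id, uint32_t fb_id)
    {
        return drmModeSetPlane(drm_fd, plane_id, crtc_id, fb_id, 0,
                               /* destination rect: the full 4K output */
                               0, 0, 3840, 2160,
                               /* source rect: the 1080p buffer */
                               0, 0, 1920 << 16, 1080 << 16);
    }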

    Originally posted by tildearrow View Post
    The scaling hardware on a monitor is guaranteed to scale the image and never miss a frame.
    This is a myth; it is not in fact always true.


    With some monitors, if you do a frame-skip test and they are skipping, you can get them replaced under warranty; with others, like a TV where you are attempting to use a resolution that is not a standard broadcast one, you have to live with the problem. So in the case where you have a monitor with a hardware scaling issue at a particular resolution, you will want to make that resolution disappear.

    Originally posted by tildearrow View Post
    OK, please look. 1080p does take less time to render than 4K, but what if I am still being bottlenecked? If we do the 1080p to 4K scaling on the graphics card, this means:
    This is going to be fun. A stack of myth-based presumptions. A lot of these depend on the GPU.

    Originally posted by tildearrow View Post
    - less performance
    Depends on the GPU. On some GPUs the buffer scaling circuit is part of the CRTC output path, so it always runs and makes no performance difference at all. Of course, if you are doing the scaling with shaders, this is true.

    Originally posted by tildearrow View Post
    - more memory usage (yeah, just to do the scaling)
    Again, this depends on the GPU. On a GPU with buffer scaling in the CRTC path there is no extra memory usage, due to the fact that the memory is always allocated.

    Originally posted by tildearrow View Post
    - we do not know how long it will take (what if the card runs out of memory, has to destroy the scaled buffer (~32MB) and recreate it when the time to scale comes?)
    One of the reasons why you would render at 1080p and upscale to 4K is that you don't have enough GPU RAM to render the complete scene at 4K in the first place.

    Yes, it's true that in the generic case, for any random GPU, we don't know how much GPU buffer scaling will cost. It's somewhere between zero cost and a lot of cost, depending on the GPU.

    You cannot expect application/game developers testing on limited hardware to get this right all the time.

    Originally posted by tildearrow View Post
    Are you saying I should just buy a new monitor?
    Why are you thinking backwards?! Where did compatibility go?
    Being able to use your GPU to pretend you have a VRR monitor when you don't means you can reduce your need to buy a new monitor, as you are able to get the CPU and GPU savings the application gets from using VRR methods without having a VRR monitor. Yes, without abstracting the monitor away from the application, the only way you can get this benefit is to buy a new monitor.

    Originally posted by tildearrow View Post
    Poor performance, sure. Remember the Windows Aero days?
    People playing games on full-screen often experience higher performance than windowed/composited (more so on low-end systems).
    Yes I do, and that is an X11-like compositor on Windows. That is not a Wayland compositor with access to zero-copy buffer actions. On Wayland, a window covering the full screen, if the compositor is done right, should instantly gain the performance of being exclusive full screen. There is more than one way to solve the problem.

    Also, something you miss is using the layer system in the GPU: you can have a part-screen window that is changing, i.e. a game, and have that buffer go straight onto an output layer that is welded into a single image in the CRTC output part of the GPU. So in a lot of ways you want an exclusive output layer in the GPU, as that basically gives you the same performance as exclusive full screen while still being a window. X11 compositors and Windows compositors are not designed to take advantage of output layers in the GPU to their fullest.

    Output layers are one of those fun things: a lot of people would not have a clue that when most GPUs build what they send to the monitor, they are fusing 4 different buffers into 1 using simple alpha masks. The first extra layer added to GPUs was the mouse cursor layer.
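
    For anyone curious, those hardware layers are exposed on Linux as KMS planes, and you can list what your GPU advertises with a few lines of libdrm. A quick illustrative sketch, assuming /dev/dri/card0 is your card and with minimal error handling:
    Code:
    /* Sketch: list the hardware output layers (KMS planes) a GPU exposes.
     * The cursor layer mentioned above is one of these planes. */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        int fd = open("/dev/dri/card0", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        /* Without this cap the kernel hides primary and cursor planes. */
        drmSetClientCap(fd, DRM_CLIENT_CAP_UNIVERSAL_PLANES, 1);

        drmModePlaneResPtr res = drmModeGetPlaneResources(fd);
        if (!res) { perror("drmModeGetPlaneResources"); return 1; }

        for (uint32_t i = 0; i < res->count_planes; i++) {
            drmModePlanePtr plane = drmModeGetPlane(fd, res->planes[i]);
            if (!plane)
                continue;
            printf("plane %u: %u formats, possible CRTCs mask 0x%x\n",
                   plane->plane_id, plane->count_formats,
                   plane->possible_crtcs);
            drmModeFreePlane(plane);
        }
        drmModeFreePlaneResources(res);
        return 0;
    }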



  • blacknova
    replied
    Well, I think it is really possible to avoid excessive copying for a full-screen application by presenting the exact buffer the application provided. That is what GBM and probably EGLStreams are all about.
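
    As a rough illustration of the GBM side of that: presenting the exact application buffer only works if the buffer was allocated as scanout-capable in the first place, which is what the GBM_BO_USE_SCANOUT flag requests. This is a sketch under that assumption; the device path is just an example.
    Code:
    /* Sketch: allocate a buffer the GPU can render into and the display
     * controller can scan out directly, the precondition for handing an
     * application's buffer to the screen without a copy. */
    #include <fcntl.h>
    #include <gbm.h>
    #include <stdio.h>

    int main(void)
    {
        int fd = open("/dev/dri/card0", O_RDWR); /* example device path */
        if (fd < 0) { perror("open"); return 1; }

        struct gbm_device *gbm = gbm_create_device(fd);
        if (!gbm) { fprintf(stderr, "no GBM device\n"); return 1; }

        /* SCANOUT: the display controller may show it directly.
         * RENDERING: the GPU may draw into it. */
        struct gbm_bo *bo = gbm_bo_create(gbm, 1920, 1080,
                                          GBM_FORMAT_XRGB8888,
                                          GBM_BO_USE_SCANOUT |
                                          GBM_BO_USE_RENDERING);
        if (!bo) { fprintf(stderr, "allocation failed\n"); return 1; }

        printf("scanout-capable buffer: %ux%u, stride %u\n",
               gbm_bo_get_width(bo), gbm_bo_get_height(bo),
               gbm_bo_get_stride(bo));

        gbm_bo_destroy(bo);
        gbm_device_destroy(gbm);
        return 0;
    }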

    And even resolution and refresh-rate adjustment, and finer control over presentation, can be coded in as extensions to Wayland.

    The problem is, I'm not sure whether there is a standard defined for these extensions or whether every Wayland compositor will try to do it on its own.

    In the end, if it comes to it, Valve can put together some ready-to-use compositor, slap their own extensions on top and just provide Steam Linux users with a "Big Picture" mode with a separate login... joy.

    After all, if no standard compositor is present, why bother with compatibility? Are we even sure that a couple of years ahead KDE and GNOME compositors will be compatible enough to run each other's applications?
    Last edited by blacknova; 27 October 2020, 05:47 PM.



  • tildearrow
    replied
    Originally posted by oiaohm View Post
    This is where you want to set the resolution, not some random application.
    Now what if I want the game to set the resolution? Do I lose choice?

    Originally posted by oiaohm View Post
    A 4K monitor with a 1080p game needs scaling to happen somewhere.
    Yep.

    Originally posted by oiaohm View Post
    I am not saying that. I said Valve does not want applications to have the means to set the resolution, not that the user should not be able to set the screen resolution.
    Well, you did say this before; let me quote you.

    Originally posted by oiaohm View Post
    So directly controlling the screen is no longer such a hot idea.
    Originally posted by oiaohm View Post
    It is absolutely not that straightforward. Not giving the application exclusive full-screen access does not benchmark out as higher latency. Not letting the application output directly is, in a lot of cases, lower latency.
    False. The compositor adds latency, like it or not. Always. There is no way this can be avoided, as we cannot predict exactly how long the compositor will take to draw the next frame.

    Originally posted by oiaohm View Post
    This is one of those things where people will be yelling about "exclusive full screen" without understanding the problem.
    ..................

    Originally posted by oiaohm View Post
    I will go back to the 4K monitor with a 1080p game. Where is your fastest scaling circuitry? Most likely your GPU.
    Sure, wasted bandwidth and power. The dedicated circuitry on a separate monitor is often more power efficient, and guaranteed to deliver the scaled image in time.

    Furthermore you did not consider this. What if the laptop cannot output... say... 4K at 60Hz but rather only 30? So what? Should I live with the lower framerate?!

    Originally posted by oiaohm View Post
    So if you are measuring from mouse click to a pixel on the screen lighting up, not giving the 1080p game exclusive full-screen access and having something like gamescope force the scaling onto the GPU is faster in most cases, because the scaling circuit in the monitor is not that great.
    The problem is that if you use the graphics card for scaling the game you cannot be sure that the scaling time is constant. So there is a chance it could potentially miss a frame if the game is loading the graphics card too much.
    The scaling hardware on a monitor is guaranteed to scale the image and never miss a frame.

    Originally posted by oiaohm View Post
    Even with a new game that has the means to do 4K, you may not want it to know that. Again, this is about latency. 1080p takes less GPU time to generate a frame than 4K; in fact there is a big enough difference that you can upscale 1080p to 4K on the GPU and still need less GPU power than a native 4K render, of course.
    OK, please look. 1080p does take less time to render than 4K, but what if I am still being bottlenecked? If we do the 1080p to 4K scaling on the graphics card, this means:

    - less performance
    - more memory usage (yeah, just to do the scaling)
    - we do not know how long it will take (what if the card runs out of memory, has to destroy the scaled buffer (~32MB) and recreate it when the time to scale comes?)

    Originally posted by oiaohm View Post
    Next, on "monitor does not support VRR/Adaptive-Sync/FreeSync": this is something you have not thought about. Applications that support VRR, on a VRR screen, in a lot of cases use less GPU and CPU time producing the result. So you absolutely want to be able to fib to the application that it has a VRR monitor and have the video card fake it on a non-VRR monitor. You also want to fake the reverse: for an application that does not support VRR when you do have a VRR monitor, you will still want at times to generate a VRR output; with a cheap monitor on a hottish day this can be the difference between the monitor locking up and it working perfectly.
    Are you saying I should just buy a new monitor?
    Why are you thinking backwards?! Where did compatibility go?

    Originally posted by oiaohm View Post
    The reality is that in most cases you don't want applications to have exclusive full-screen access. It's really easy to miss that a well-made Wayland compositor can have zero frames of overhead when sitting in the middle.
    Please stop with this. The compositor ALWAYS adds latency in one way or the other, by doing an unnecessary copy even with a full-screen application!

    Originally posted by oiaohm View Post
    It's also really easy to ignore all the above cases where giving the application exclusive full-screen access results in bad performance. The most common result of giving an application exclusive full-screen access is poor performance.
    Poor performance, sure. Remember the Windows Aero days?
    People playing games on full-screen often experience higher performance than windowed/composited (more so on low-end systems).

    Originally posted by oiaohm View Post
    The big thing here is that you want the user in control of what resolution the screen is and of the lie about the monitor that the application believes; this is not what exclusive full-screen access has historically meant.
    Ughhhhhh oiaohm -_-
    Last edited by tildearrow; 27 October 2020, 05:04 PM.



  • oiaohm
    replied
    Originally posted by tildearrow View Post
    Stop. What if I want to really set the resolution?
    This is where you want to set the resolution, not some random application.

    Originally posted by tildearrow View Post
    Say, if I bought a 4K monitor and want to use it with my laptop but at 1080p because it is not powerful enough? What about the refresh rate? What if my monitor does not support VRR/Adaptive-Sync/FreeSync?
    A 4K monitor with a 1080p game needs scaling to happen somewhere.

    Originally posted by tildearrow View Post
    Don't tell me I should not be able to set the resolution under Wayland, that's stupid!
    I am not saying that. I said Valve does not want applications to have the means to set the resolution, not that the user should not be able to set the screen resolution.

    Originally posted by tildearrow View Post
    And "Exclusive Full Screen access Valve does not want applications with"?! That means extra latency, by not letting the application output directly to the screen!
    It is absolutely not that straightforward. Not giving the application exclusive full-screen access does not benchmark out as higher latency. Not letting the application output directly is, in a lot of cases, lower latency.

    This is one of those things where people will be yelling about "exclusive full screen" without understanding the problem.

    I will go back to the 4K monitor with a 1080p game. Where is your fastest scaling circuitry? Most likely your GPU. So if you are measuring from mouse click to a pixel on the screen lighting up, not giving the 1080p game exclusive full-screen access and having something like gamescope force the scaling onto the GPU is faster in most cases, because the scaling circuit in the monitor is not that great.

    Even with a new game that has the means to do 4K, you may not want it to know that. Again, this is about latency. 1080p takes less GPU time to generate a frame than 4K; in fact there is a big enough difference that you can upscale 1080p to 4K on the GPU and still need less GPU power than a native 4K render, of course.

    Next, on "monitor does not support VRR/Adaptive-Sync/FreeSync": this is something you have not thought about. Applications that support VRR, on a VRR screen, in a lot of cases use less GPU and CPU time producing the result. So you absolutely want to be able to fib to the application that it has a VRR monitor and have the video card fake it on a non-VRR monitor. You also want to fake the reverse: for an application that does not support VRR when you do have a VRR monitor, you will still want at times to generate a VRR output; with a cheap monitor on a hottish day this can be the difference between the monitor locking up and it working perfectly.
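
    (For reference, a compositor can at least discover whether the real monitor advertises VRR before deciding what to tell the application. A rough sketch, assuming a kernel that exposes this as the DRM connector property "vrr_capable"; connector_id is a hypothetical placeholder.)
    Code:
    /* Sketch: check whether a connected monitor advertises variable
     * refresh rate via the DRM connector property "vrr_capable". */
    #include <stdbool.h>
    #include <stdint.h>
    #include <string.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    static bool monitor_is_vrr_capable(int drm_fd, uint32_t connector_id)
    {
        bool capable = false;
        drmModeObjectPropertiesPtr props =
            drmModeObjectGetProperties(drm_fd, connector_id,
                                       DRM_MODE_OBJECT_CONNECTOR);
        if (!props)
            return false;

        for (uint32_t i = 0; i < props->count_props; i++) {
            drmModePropertyPtr prop =
                drmModeGetProperty(drm_fd, props->props[i]);
            if (!prop)
                continue;
            if (strcmp(prop->name, "vrr_capable") == 0)
                capable = props->prop_values[i] != 0;
            drmModeFreeProperty(prop);
        }
        drmModeFreeObjectProperties(props);
        return capable;
    }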

    The reality is that in most cases you don't want applications to have exclusive full-screen access. It's really easy to miss that a well-made Wayland compositor can have zero frames of overhead when sitting in the middle. It's also really easy to ignore all the above cases where giving the application exclusive full-screen access results in bad performance. The most common result of giving an application exclusive full-screen access is poor performance.

    The big thing here is that you want the user in control of what resolution the screen is and of the lie about the monitor that the application believes; this is not what exclusive full-screen access has historically meant.



  • Ignacio Taranto
    replied
    Originally posted by bobbie424242 View Post
    Wake me up when I can run Sway without telling me that "--my-next-gpu-wont-be-nvidia".
    Until then, Xorg works just fine with i3.
    That command line flag is actually hilarious.
    I use both i3 and Sway, with the same base config file for both. The best of both worlds...



  • Danielsan
    replied
    Originally posted by Awesomeness View Post

    You obviously do not have the slightest idea what you're talking about. Not only do NVidia drivers support Wayland via EGLStreams, it is Red Hat who wrote EGLStreams support in Gnome and XWayland. To claim the polar opposite just shows how incredibly ignorant you are. 🤦‍♂️
    This link says that everything is just in an initial state...

