Wayland Protocols 1.38 Brings System Bell, FIFO & Commit Timing Protocols
-
Well, you wouldn't know what they know, as you admittedly aren't familiar with said projects.
-
I think I'll listen to the multi-decade Linux graphics stack developer over someone who admittedly has little Wayland or GNOME knowledge.
-
Originally posted by Uiop: You are actually arguing that many Wayland clients should include some frame-rate detection functionality (which might not be reliable at all).
I don't believe your claim that the compositor has no extra information. It has more direct access to the GPU. It can better estimate the compositing delay and compositing complexity. It knows about all the other connected Wayland clients, which can affect the preferred timing of frames.
With triple buffering, do clients have problems with running at too high a frame rate and causing excessive GPU utilization? I've heard of such problems.
With both double and triple buffering, are clients really achieving the minimum possible latency? Without good information on frame timing, they cannot.
The client can keep track of these timestamps for each frame:
1. When it starts working on the frame.
2. When it submits the frame to the compositor (calls wl_surface_attach + wl_surface_commit and flushes the display connection).
3. When the GPU finishes drawing the frame (GL/Vulkan timestamp queries).
4. When the frame was presented (Presentation time protocol).
From these, it can estimate / probe:
5. How long before the presentation time the compositor deadline is.
And from that it can determine when it needs to start working on the next frame to achieve minimal latency. (Mutter does the moral equivalent of this to minimize its own latency impact.)
A new protocol which tells the client the compositor's prediction of item 5 for the next refresh cycle wouldn't make any difference for items 1-4. Even for item 5, it's still just a prediction, which may turn out wrong. I.e. it's just a minor quantitative difference, not a qualitative one.
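For anyone wanting to try this, here is a minimal sketch of items 1, 2, 4 and 5 using the presentation-time protocol. It assumes the compositor advertised CLOCK_MONOTONIC via the wp_presentation.clock_id event and that the header was generated by wayland-scanner; global binding, rendering, and the GL/Vulkan timestamp queries for item 3 are omitted.

```c
#include <stdint.h>
#include <time.h>
#include <wayland-client.h>
#include "presentation-time-client-protocol.h" /* from wayland-scanner */

struct frame_times {
	uint64_t start_ns;   /* 1. started working on the frame */
	uint64_t commit_ns;  /* 2. attached + committed the buffer */
	uint64_t present_ns; /* 4. actual presentation time */
};

static uint64_t now_ns(void)
{
	struct timespec ts;
	clock_gettime(CLOCK_MONOTONIC, &ts); /* must match wp_presentation.clock_id */
	return (uint64_t)ts.tv_sec * 1000000000ull + ts.tv_nsec;
}

static void presented(void *data, struct wp_presentation_feedback *fb,
		      uint32_t tv_sec_hi, uint32_t tv_sec_lo, uint32_t tv_nsec,
		      uint32_t refresh, uint32_t seq_hi, uint32_t seq_lo,
		      uint32_t flags)
{
	struct frame_times *t = data;
	t->present_ns = (((uint64_t)tv_sec_hi << 32) | tv_sec_lo) * 1000000000ull
			+ tv_nsec;

	/* 5. This commit made this presentation, so the compositor deadline is
	 * at most (present - commit) before presentation. Keep the minimum
	 * observed slack over many frames to tighten the estimate, then start
	 * the next frame around (predicted_present - slack - own render time). */
	uint64_t slack = t->present_ns - t->commit_ns;
	(void)slack; (void)refresh; (void)seq_hi; (void)seq_lo; (void)flags;
	wp_presentation_feedback_destroy(fb);
}

static void sync_output(void *data, struct wp_presentation_feedback *fb,
			struct wl_output *output) {}
static void discarded(void *data, struct wp_presentation_feedback *fb)
{
	wp_presentation_feedback_destroy(fb); /* this frame was never shown */
}

static const struct wp_presentation_feedback_listener feedback_listener = {
	.sync_output = sync_output, .presented = presented, .discarded = discarded,
};

static void submit_frame(struct wp_presentation *presentation,
			 struct wl_surface *surface, struct wl_buffer *buffer,
			 struct frame_times *t)
{
	t->start_ns = now_ns(); /* 1: really taken before rendering starts */
	/* ... render into buffer; GPU timestamp queries would give item 3 ... */
	struct wp_presentation_feedback *fb =
		wp_presentation_feedback(presentation, surface);
	wp_presentation_feedback_add_listener(fb, &feedback_listener, t);
	wl_surface_attach(surface, buffer, 0, 0);
	wl_surface_commit(surface);
	t->commit_ns = now_ns(); /* 2: right after commit (wl_display_flush elided) */
}
```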
In summary, you're barking up the wrong tree here. You should rather poke client developers to take advantage of the tools already available to them. No Wayland protocol can magically take care of it for them.
-
Originally posted by Uiop: First of all, the decision to enable or disable subpixel antialiasing has nothing to do with display resolution (as someone pointed out).
But no one pointed out that it is only partially related to pixel density; the eye-to-display distance should also be taken into account.
Therefore, it is impossible to automatically detect whether "HiDPI" mode should be used; instead, it should be the user's decision.
Let users specify the logical pixel density, and let users specify the subpixel orientation of each display manually. All those "hardware lies, we can't detect it correctly" excuses would then be moot.
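To make that concrete, here is a purely hypothetical sketch of the kind of per-output overrides being asked for; none of these names come from any real compositor's configuration.

```c
/* Hypothetical per-output user overrides; all names are illustrative. */
enum subpixel_layout {
	SUBPIXEL_NONE, /* disable subpixel AA, e.g. rotated or OLED panels */
	SUBPIXEL_RGB, SUBPIXEL_BGR, SUBPIXEL_VRGB, SUBPIXEL_VBGR
};

struct output_override {
	const char *output_name;     /* e.g. "DP-1" */
	double logical_dpi;          /* user-chosen, not derived from EDID */
	enum subpixel_layout layout; /* user-chosen subpixel orientation */
};
```

For what it's worth, wl_output's geometry event does already report a subpixel layout enum, but it reflects what the hardware claims, which is exactly the "hardware lies" problem.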
-
Originally posted by Uiop: I'm strongly against such assumptions about what clients should "guess".
In a good protocol, the client says what it wants to the compositor, and the compositor responds with the information requested by the client.
Deviating from such principles eventually results in a bad protocol.
So, from my point of view, the client requests estimates of future frame times, and then gets those estimates from the compositor.
You should try to decouple the protocol from your wild guesses about what the user's hardware can do.
Experience shows that trying to anticipate the future in a protocol (or API) tends to be a mistake, because things tend to evolve differently than we expect. A good protocol works well for the real world at present and is extensible for future needs.
Anyway, if that "presentation protocol" is done right, it should also be able to solve some of Wayland's problems with double and triple buffering.
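For what it's worth, the commit-timing-v1 protocol from this very release is the closest thing so far, though it points the other way: the client states a target presentation time for a commit rather than receiving deadline predictions from the compositor. A rough sketch, with names assumed from the commit-timing-v1 staging protocol and a wayland-scanner generated header; the timestamp is assumed to be in the presentation clock's domain (e.g. CLOCK_MONOTONIC).

```c
#include <stdint.h>
#include <wayland-client.h>
#include "commit-timing-v1-client-protocol.h" /* from wayland-scanner */

/* The timer is created once per surface:
 *   timer = wp_commit_timing_manager_v1_get_timer(manager, surface); */
static void commit_at(struct wp_commit_timer_v1 *timer,
		      struct wl_surface *surface, struct wl_buffer *buffer,
		      uint64_t target_ns /* presentation clock domain */)
{
	uint64_t sec = target_ns / 1000000000ull;

	/* Applies to the next commit only: the compositor should not present
	 * this content update before target_ns. */
	wp_commit_timer_v1_set_timestamp(timer,
					 (uint32_t)(sec >> 32),
					 (uint32_t)(sec & 0xffffffffu),
					 (uint32_t)(target_ns % 1000000000ull));
	wl_surface_attach(surface, buffer, 0, 0);
	wl_surface_commit(surface);
}
```

Note that this lets the client say when content should appear; it does not make the compositor report predicted deadlines, which is what was being requested above.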
-
Originally posted by Uiop:
About copying the macOS desktop: I personally don't like the macOS desktop or GNOME Shell, but I guess about 20% of people do (i.e. the users of macOS), so a similar desktop on top of GNU/Linux is a good way to attract those users.
Other than that, I don't see how GNOME is copying macOS.
About HiDPI and the link you posted: my display is set to "slight" subpixel antialiasing, because I use something in between HiDPI and normal. Given current desktop resolutions, I would guess that most users are in a similar position, with wide variation. So it is impossible to have one default setting that fits everyone, and naturally there are going to be fights about the default subpixel antialiasing setting.
Therefore, I can't take the linked discussion as evidence of anything extraordinary.
I agree that with high enough DPI, subpixel AA can retire. But the installed base is far from that in 2024, and it won't be there even if we fast-forward to 2034. It takes a pixel density of 250%+ of the logical 96 dpi baseline (roughly 240 dpi) for subpixel AA to stop being useful, while most computer monitors sit between 100% and 200%. Monitors in the classical DPI range are going to stay for perhaps another 10 to 20 years.
Because the decision is made in GTK, they are forcing their aesthetics on more than just GNOME DE users. Oh yeah, those arrogant fanboys will call for the abolition of "hobby" DEs. I can hear them coming.