X.Org Server Adds "AsyncFlipSecondaries" To Deal With Crappy Multi-Monitor Experience

  • billyswong
    replied
    Originally posted by binarybanana View Post
    I wonder how Wayland handles windows spanning monitors with mixed refresh rates? Say one at 60Hz and one at 75Hz. Does the window render once for each refresh of every involved monitor? So if you scroll a browser window spanning those two screens, does it render at 60+75=135Hz? Arguably it should, otherwise you get judder/jerky scrolling, but that would lead to some crazy frame rates on more diverse setups, unless only the parts of the window visible on the target monitor are rendered. But since Wayland clients aren't supposed to know their position, this sounds hard to implement without leaking the window position through the "render mask"/scissors. If true, that would be one area where Wayland's security bullshit directly conflicts with performance and the "every frame is perfect" philosophy, since uneven frame times are just as bad as tearing. You could even call them temporal tearing.
    If there is ever a truly "every frame is perfect" goal that wants to eliminate even uneven frame times, a render surface should be able to contain multiple buffers, one for each monitor, for when the compositor tells a client it is spanning multiple non-identical monitors. Wayland doesn't seem to have that. The need for multiple buffers arises again if one wants perfect subpixel-AA text on mixed-DPI displays. But no, the Wayland team heads don't seem to have understood that text cannot upscale/downscale well after rasterization, nor any other mixed-display scenario.

    Rendering the whole surface twice is inefficient, but it may be worth the cost for static content, which describes most GUI applications. Let legacy applications that can output only one buffer read only the highest spec among the monitors. Everyone would have been happy. Since the client still doesn't know its screen position, only which monitor profiles it is spanning, the Wayland security team can still claim nothing is "leaking". Monitor frame rates and DPI are essential information for real-time generated animation anyway.
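    A minimal sketch of the multi-buffer idea: a spanning surface attaches one buffer per monitor profile, and the compositor picks the matching one per output, falling back to a single buffer for legacy clients. The class and method names here are illustrative, not any real Wayland protocol.

    ```python
    # Hypothetical sketch: one buffer per monitor profile on a spanning
    # surface, with a fallback for legacy single-buffer clients.

    class SpanningSurface:
        def __init__(self):
            self.buffers = {}        # monitor profile -> rendered buffer

        def attach(self, profile, buffer):
            self.buffers[profile] = buffer

        def buffer_for(self, profile):
            # Fall back to any available buffer if none matches this output.
            return self.buffers.get(profile) or next(iter(self.buffers.values()))

    surf = SpanningSurface()
    surf.attach(("60Hz", "96dpi"), "buffer-lowdpi")
    surf.attach(("75Hz", "192dpi"), "buffer-hidpi")
    print(surf.buffer_for(("75Hz", "192dpi")))   # buffer-hidpi
    print(surf.buffer_for(("120Hz", "96dpi")))   # fallback: buffer-lowdpi
    ```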



  • billyswong
    replied
    Originally posted by sinepgib View Post
    I see. Even then, should this be part of the compositing protocol or something transparent that each compositor handles? IOW, do clients need to know about this?
    From what I learnt so far, color management is becoming ugly.

    Framerate is easy. There is only one way to handle a mismatch between the monitor's framerate and the client's: the compositor tells the client the maximum framerate the monitor(s) can output; the client sends frames to the compositor; the compositor outputs the old frame if the client is slow, or the new frame if the client is fast enough. If multiple surfaces update their frames at different rates, mix old frames and new frames. It is fine for client programs to ignore what framerate the monitor(s) run at and emit frame updates at whatever time and frequency they want. (Thus 3D benchmarks can report a framerate higher than that of the fastest monitor on the market.)
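    The latching policy described above can be sketched in a few lines: at each output vblank the compositor shows a surface's newest committed frame, or repeats the previous one if nothing new arrived. All names are made up for the example; this is not a real compositor API.

    ```python
    # Sketch of per-vblank frame latching: newest committed frame wins,
    # old frame is repeated when the client is slower than the monitor.

    class Surface:
        def __init__(self):
            self.pending = None      # frame committed since last vblank
            self.current = None      # frame shown on the last vblank

        def commit(self, frame):
            # Clients may commit at any rate; only the newest frame survives.
            self.pending = frame

        def latch(self):
            # Called by the compositor on each output vblank.
            if self.pending is not None:
                self.current = self.pending
                self.pending = None
            return self.current

    surf = Surface()
    surf.commit("frame-1")
    surf.commit("frame-2")          # client is faster than the monitor
    print(surf.latch())             # newest frame shown: frame-2
    print(surf.latch())             # client was slow: frame-2 repeated
    ```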

    DPI is hard. There are multiple ways to do scaling. Photos are usually best scaled smoothly, accepting some blur. UI elements and text usually want sharp edges, at the cost of lines becoming wider or thinner than intended. Pixel art wants each logical pixel treated as a square, keeping diagonal lines stair-stepped. Hairlines want to stay one device pixel no matter how high the display DPI is. There are also image viewers / editors etc. that want NO scaling applied to their content area. All these different policies accumulate deviations in total object size and affect the overall UI layout. (For text, such deviation wouldn't matter if text engines compensated by adjusting character spacing / line spacing, but in most cases they don't.) In the end, most compositors give up: they hand the client program a DPI value and ask it to return an already-scaled frame instead.

    Color management is hardest. Why? In the DPI case, the best way to scale each thing is knowable if the GUI system opens a channel and asks everyone; it's only a question of where and when to ask throughout the system. But in color management, the answer is often a big unknown. When an image says its #FFFFFF is D65 white but a monitor outputs (or claims to output) D50 white as its brightest color, what should the computer do? There can be no consensus on that. Similar questions can be asked for black, red, green, blue, etc. (ink black on paper differs from LCD black, which differs from OLED black). With the introduction of high dynamic range, the question of how to mix HDR and SDR content is even harder. Do the creators of HDR material have in mind how it should be clamped or compressed in color / brightness range for all the different monitor / projector capabilities? Probably not.

    In an ideal world, there would be an HDR/WCG color profile independent of the traditional color profile. Such an HDR profile would provide the HDR white : SDR white ratio of the source material and also a suggested transfer function for handling a different ratio on the output system. In reality, much HDR content, especially movies, uses a color profile that encodes only *absolute brightness*. That kind of profile should be frowned upon outside raw photo / video capture, but well... Some HDR profiles hardcode a standard SDR white value in their specs, but that doesn't solve the transformation problem completely: it is still unknown when to clamp, when to compress, and when the SDR part should be compressed together with the HDR part.
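    The clamp-vs-compress choice can be sketched numerically. This is a toy transform under assumed numbers (203 nits for SDR white, a 600-nit display), and the knee-based roll-off is one illustrative choice, not any standard tone mapper.

    ```python
    # Toy comparison: clamping vs. compressing HDR highlights into a
    # display's peak, leaving the SDR range (<= SDR_WHITE) untouched.

    SDR_WHITE = 203.0   # nits; assumed reference value for the example

    def clamp(nits, peak):
        return min(nits, peak)

    def compress(nits, peak):
        # Pass SDR content through; compress only the HDR headroom.
        if nits <= SDR_WHITE:
            return nits
        headroom_in = nits - SDR_WHITE
        headroom_out = peak - SDR_WHITE
        # Simple rational roll-off toward the display peak.
        return SDR_WHITE + headroom_out * headroom_in / (headroom_in + headroom_out)

    peak = 600.0        # a mid-range HDR monitor
    print(clamp(1000.0, peak))      # 600.0: all detail above peak is lost
    print(compress(1000.0, peak))   # stays below 600: highlights kept distinct
    print(compress(100.0, peak))    # 100.0: the SDR part is unchanged
    ```

    Clamping preserves SDR/mid-tone intent exactly but crushes highlights; compressing keeps highlight detail but dims everything above SDR white, which is exactly the trade-off a profile's suggested transfer function would have to settle.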

    Originally posted by sinepgib View Post
    Yes, I can see why this wouldn't make sense. Even if it were feasible, which I doubt, multiplying the frame rates by 5 and by 12 (!!!!!), respectively, for something that can't and won't be shown, as in a 144Hz+60Hz setup, would be extremely inefficient. I can also see why the compositor needs to know about this in that case.

    I wonder how either problem can be handled when half a window is in either screen.
    For the half-on-A, half-on-B scenario, we may either (a) give the client the higher spec, and then downscale the quality for the lower-spec monitor; or (b) give the client both specs and a chance to output 2 framebuffers. The former (a) is probably what most Linux GUIs will go for. From what I observe, what Windows does is (c) give the spec of whichever monitor holds the cursor doing the window dragging, and then output the overlapping part on the other screen as-is, with no scaling.
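    The three strategies reduce to a small decision function. A "spec" here is just a (refresh, DPI) pair, and picking the "higher" spec by refresh rate is a simplification for the sketch; all names are illustrative.

    ```python
    # Toy sketch of spanning strategies (a), (b), (c) described above.
    # A spec is (refresh_hz, dpi); "highest" is simplified to max by refresh.

    def strategy_a(specs):
        # (a) hand the client the highest spec; downscale for lesser outputs
        return max(specs)

    def strategy_b(specs):
        # (b) hand the client every spec; one buffer per output
        return list(specs)

    def strategy_c(specs, cursor_output):
        # (c) Windows-style: use the spec of the output under the drag cursor
        return specs[cursor_output]

    specs = [(144, 96), (60, 192)]
    print(strategy_a(specs))        # (144, 96)
    print(strategy_b(specs))        # both specs, client renders twice
    print(strategy_c(specs, 1))     # (60, 192): cursor is on output 1
    ```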



  • binarybanana
    replied
    Wow, that's insane (the old behavior). If you sync to *one* CRTC, especially for a full-screen game or whatever, waiting for any other monitor is totally bonkers. There is *nothing* to sync! That only makes sense for windows that span multiple screens, or in cloned mode. Did Keith Packard come up with this (he created Present), or what happened? I always believed the mixed-refresh-rate mess could be fixed by just correctly specifying the CRTCs in the call to XPresentPixmap() depending on the window position. Ideally, for monitor-spanning windows, the pixmaps would be broken into pieces that each sync individually to the monitor they're on, either by the program, the compositor, or (maybe best) the X server itself. Then only the target CRTC/refresh rate would be relevant for blocking/backpressure as in this patch, with no edge cases that cause tearing. Unless there is more unexpected brain damage in there somehow. Judging by the size of this patch, it could still be easy to fix more thoroughly, but this should be fine except for edge cases.

    I wonder how Wayland handles windows spanning monitors with mixed refresh rates? Say one at 60Hz and one at 75Hz. Does the window render once for each refresh of every involved monitor? So if you scroll a browser window spanning those two screens, does it render at 60+75=135Hz? Arguably it should, otherwise you get judder/jerky scrolling, but that would lead to some crazy frame rates on more diverse setups, unless only the parts of the window visible on the target monitor are rendered. But since Wayland clients aren't supposed to know their position, this sounds hard to implement without leaking the window position through the "render mask"/scissors. If true, that would be one area where Wayland's security bullshit directly conflicts with performance and the "every frame is perfect" philosophy, since uneven frame times are just as bad as tearing. You could even call them temporal tearing.



  • ssokolow
    replied
    Originally posted by sinepgib View Post
    I see. Even then, should this be part of the compositing protocol or something transparent that each compositor handles? IOW, do clients need to know about this?
    I think it'd depend on the application. For example, something like GIMP or Photoshop would need to integrate with it to make sure that the colour profile potentially embedded in the image gets mapped to the display's colour profile with minimal rounding error from intermediate conversion steps.
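    The "minimal rounding error from intermediate conversion steps" point can be demonstrated with a toy calculation: quantizing to 8 bits at every intermediate step loses more precision than converting in one pass and quantizing once. The gamma values here are arbitrary stand-ins for real profile transforms.

    ```python
    # Toy demonstration: per-step 8-bit quantization accumulates more
    # error than a single quantization after a combined conversion.

    def quantize(x):
        # Snap to the nearest 8-bit code value, back to [0, 1]
        return round(x * 255) / 255

    def convert(x, gamma):
        # Stand-in for one profile transform step
        return x ** gamma

    v = 0.2
    # Combined conversion, quantized once at the end:
    direct = quantize(convert(convert(v, 2.2), 1 / 2.2))
    # Same conversions, but quantized at the intermediate step too:
    stepped = quantize(convert(quantize(convert(v, 2.2)), 1 / 2.2))

    print(abs(direct - v) < abs(stepped - v))   # True: fewer steps, less error
    ```

    This is why an application like GIMP wants to map the image's embedded profile to the display profile directly, rather than bouncing through whatever intermediate space a naive pipeline would impose.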



  • sinepgib
    replied
    Originally posted by billyswong View Post
    If "display gamma control" is only for correcting a less-than-ideal monitor to output colors closer to sRGB, then yes, it shouldn't matter to compositing. But with all those fancy wide-color-gamut / high-dynamic-range source materials (images / video) and monitors on the market, each with its own color space, color curve, dynamic range, and "intent" incompatible with the others, it may be impractical for the compositor to dictate a universal master color space that windows / surfaces upscale to, then convert them all back to the monitor profile. (I heard such wishful thinking has already cost video playback performance on HiDPI monitors in Linux in the past.)
    Originally posted by billyswong View Post
    Who knows if someone wants to run the desktop UI in sRGB, then apply a different gamut A to image surface A and a different gamut B to video surface B, with them overlapping 2 wide-gamut displays, each with a different dynamic range (max brightness)? While Wayland can refuse to implement "display gamma control", just as it outsourced previous responsibilities of X to libinput / pipewire, the void of "display gamma control" is still there, waiting for someone to step up and fill it.
    I see. Even then, should this be part of the compositing protocol or something transparent that each compositor handles? IOW, do clients need to know about this?

    Originally posted by billyswong View Post
    Another example: the compositor needs to know the refresh rate of each monitor for the best experience when high-frame-rate / VRR monitors are mixed with standard monitors. In theory, one could push the framebuffer at the common multiple of all the monitors' rates, but in practice such a high rate is either impossible or too power-hungry compared to the other approach.
    Yes, I can see why this wouldn't make sense. Even if it were feasible, which I doubt, multiplying the frame rates by 5 and by 12 (!!!!!), respectively, for something that can't and won't be shown, as in a 144Hz+60Hz setup, would be extremely inefficient. I can also see why the compositor needs to know about this in that case.

    I wonder how either problem can be handled when half a window is in either screen.

    NOTE: I split in several quotes to change the order of paragraphs, due to some questions being related to first and last while others to second.



  • ssokolow
    replied
    Originally posted by billyswong View Post
    just like how they outsource previous responsibilities of X to libinput / pipewire
    Bad examples.

    libinput is a replacement for older, more hardware-specific input drivers, and it's also used by X.org. Describing it that way would be like saying that ffdshow "replaced" DirectShow with FFmpeg, when it just replaced the older mess of codecs that plugged into DirectShow.

    As for pipewire, it's "replacing" having applications talk directly to devices like /dev/video0 so in the purest sense, it's not replacing anything at all... just adding a new routing layer so applications don't need the video input equivalent of "every DOS game comes with its own pack of sound card drivers" to support every possible type of video source on Linux. (eg. You could use the PipeWire equivalent of pavucontrol to feed an application a screen recording or video streaming in over the network when it was only written with the expectation to read from a webcam.)



  • billyswong
    replied
    Originally posted by sinepgib View Post

    Maybe it is a stupid question, but what does that have to do with window management and compositing? Isn't that more a task for display management?

    EDIT: also, who is supposed to take care of display management anyway?
    If "display gamma control" is only for correcting a less-than-ideal monitor to output colors closer to sRGB, then yes, it shouldn't matter to compositing. But with all those fancy wide-color-gamut / high-dynamic-range source materials (images / video) and monitors on the market, each with its own color space, color curve, dynamic range, and "intent" incompatible with the others, it may be impractical for the compositor to dictate a universal master color space that windows / surfaces upscale to, then convert them all back to the monitor profile. (I heard such wishful thinking has already cost video playback performance on HiDPI monitors in Linux in the past.)

    Another example: the compositor needs to know the refresh rate of each monitor for the best experience when high-frame-rate / VRR monitors are mixed with standard monitors. In theory, one could push the framebuffer at the common multiple of all the monitors' rates, but in practice such a high rate is either impossible or too power-hungry compared to the other approach.
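    The "common multiple" approach is easy to quantify: driving every output from one clock means running at the least common multiple of the refresh rates, which explodes quickly for mixed setups.

    ```python
    # Quick arithmetic for the common-multiple approach: the shared clock
    # is the LCM of the refresh rates, and each monitor would repeat (or
    # the GPU would render) frames at that rate.
    from math import lcm

    for rates in ((144, 60), (75, 60), (144, 120, 60)):
        common = lcm(*rates)
        multipliers = [common // r for r in rates]
        print(rates, "-> LCM", common, "Hz; per-monitor multipliers", multipliers)
    # (144, 60)      -> LCM 720 Hz; multipliers [5, 12]
    # (75, 60)       -> LCM 300 Hz; multipliers [4, 5]
    # (144, 120, 60) -> LCM 720 Hz; multipliers [5, 6, 12]
    ```

    Even the mild 75Hz+60Hz pairing needs a 300Hz master clock, which is why latching frames per output (the other approach) wins in practice.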

    Who knows if someone wants to run the desktop UI in sRGB, then apply a different gamut A to image surface A and a different gamut B to video surface B, with them overlapping 2 wide-gamut displays, each with a different dynamic range (max brightness)? While Wayland can refuse to implement "display gamma control", just as it outsourced previous responsibilities of X to libinput / pipewire, the void of "display gamma control" is still there, waiting for someone to step up and fill it.



  • MrCooper
    replied
    Originally posted by whitecat View Post
    Nobody cares if Wayland is technically able to do that or not.
    The fact is that the end-user cannot set it up with the GNOME or KDE screen manager because there is no such option...
    It means it's possible for them to implement support for this at some point. If you care about it, I suggest filing an issue at https://gitlab.gnome.org/GNOME/mutter/-/issues.



  • sinepgib
    replied
    Originally posted by theriddick View Post
    Wayland doesn't even have display gamma control. Always makes me laugh....
    Maybe it is a stupid question, but what does that have to do with window management and compositing? Isn't that more a task for display management?

    EDIT: also, who is supposed to take care of display management anyway?



  • theriddick
    replied
    Wayland doesn't even have display gamma control. Always makes me laugh....

