30-bit Deep Color For GNOME On Wayland Will Likely Take Some Time


  • caligula
    replied
    Originally posted by pal666 View Post
    I can assure you that multithreaded GUI toolkits can't work. The toolkit should be single-threaded, but it should do only GUI-related stuff; all heavy lifting (which is user code) should be submitted to other threads. So the toolkit's single-threadedness is the right thing; the application itself should be multithreaded.
    I guess the biggest issue is that, as the code base grows, at some point it becomes hard to guess when your application logic is "slow enough" to be offloaded to another thread. It can happen gradually. It's easy if you're doing I/O-bound work, but if your slow code is CPU-bound, you'll need a profiler. It's also somewhat vague: on modern machines, GUI toolkits typically offer lots of headroom for slow user code, yet they can only execute a limited budget of instructions per GUI frame. So they're real-time systems with huge amounts of spare CPU power. If one wanted a disciplined approach to GUI building, one would probably need a non-Turing-complete language in which you can calculate the total number of steps a piece of application logic can take. That way the compiler could already tell at compile time if some application logic is too slow to run on a system with some predefined specs.

    Other than that, having multiple threads doing the core GUI logic just complicates things and won't provide any extra functionality one can't have with just one thread + explicit allocation of new worker threads.
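
    As a rough illustration of that pattern (a sketch of my own, not code from the thread; it assumes GLib/GTK, and do_heavy_work()/publish_result() are hypothetical names): the toolkit thread only touches widgets, the heavy lifting runs on a worker thread, and g_idle_add(), which is safe to call from any thread, hands the result back to the main loop.
    Code:
    #include <gtk/gtk.h>

    typedef struct {
        GtkLabel *label;   /* widget owned by the GUI thread */
        char     *result;  /* produced by the worker thread */
    } WorkResult;

    static gboolean publish_result(gpointer data)
    {
        WorkResult *wr = data;
        gtk_label_set_text(wr->label, wr->result);  /* runs on the GUI thread */
        g_free(wr->result);
        g_free(wr);
        return G_SOURCE_REMOVE;
    }

    static gpointer do_heavy_work(gpointer data)
    {
        WorkResult *wr = data;
        wr->result = g_strdup("done");   /* pretend this was CPU-bound work */
        g_idle_add(publish_result, wr);  /* schedule the GUI update */
        return NULL;
    }

    /* Called from a signal handler on the GUI thread, e.g. a button click. */
    static void start_work(GtkLabel *label)
    {
        WorkResult *wr = g_new0(WorkResult, 1);
        wr->label = label;
        g_thread_unref(g_thread_new("worker", do_heavy_work, wr));
    }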
    Last edited by caligula; 23 August 2020, 12:01 PM.


  • gedgon
    replied
    Originally posted by bug77 View Post

    It's not easy to get banding using 8bpc, [...]
    It's super easy with only 256 shades. Just look at GNOME Shell's full-screen polkit authentication dialog; it looks like crap in 8-bit.
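
    For a sense of scale (back-of-the-envelope figures of my own, not from the thread): a smooth full-screen gradient quantized to 8 bits per channel gives bands several pixels tall on a 4K panel, which is exactly what shows up on dark dialog backgrounds.
    Code:
    #include <stdio.h>

    int main(void)
    {
        const double height = 2160.0;  /* assumed 4K panel height in pixels */

        printf(" 8-bit: %.1f px per band\n", height / 256.0);   /* ~8.4 px */
        printf("10-bit: %.1f px per band\n", height / 1024.0);  /* ~2.1 px */
        return 0;
    }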


  • TheLexMachine
    replied
    Originally posted by aufkrawall View Post
    Putting effort into 10-bit presentation support is a waste of time unless later HDR support can profit from it; you can dither and you won't see grain or banding with 8-bit output anyway.
    I'd rather see more effort spent directly on HDR support.
    HDR isn't possible without standardization, and nobody has made any real headway on that. The companies working on it each have their own approach, but there's no consensus on what needs to be done to implement it in projects like Wayland.
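
    For reference, the dithering mentioned in the quote is roughly this (a sketch of ordered Bayer dithering under my own assumptions, not code from any project): render at 10 bits per channel internally, add a small position-dependent threshold, then truncate to 8 bits, turning banding into barely visible noise.
    Code:
    #include <stdint.h>

    static const uint8_t bayer4[4][4] = {
        { 0,  8,  2, 10},
        {12,  4, 14,  6},
        { 3, 11,  1,  9},
        {15,  7, 13,  5},
    };

    /* value10: internal channel value 0..1023; x, y: pixel position */
    static uint8_t dither_10_to_8(uint16_t value10, int x, int y)
    {
        /* Dropping 2 bits means a quantization step of 4, so scale the
         * 0..15 Bayer threshold down to 0..3 before adding it. */
        int v = value10 + (bayer4[y & 3][x & 3] >> 2);
        if (v > 1023)
            v = 1023;
        return (uint8_t)(v >> 2);
    }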


  • pal666
    replied
    Originally posted by mdedetrich View Post
    Which is the problem: having the C library implement the GUI/UI toolkit as single-threaded is what causes the issue. The whole point is that GUI/UI toolkits should be multithreaded/asynchronous to provide an optimal experience for the user.
    I can assure you that multithreaded GUI toolkits can't work. The toolkit should be single-threaded, but it should do only GUI-related stuff; all heavy lifting (which is user code) should be submitted to other threads. So the toolkit's single-threadedness is the right thing; the application itself should be multithreaded.


  • Timo Jyrinki
    replied
    FWIW, I've now been running mutter with this patch on openSUSE Tumbleweed for 1.5 days and haven't seen anything odd. So, just in case you want to play with it or know something worth checking with it:
    Code:
    # add the repo carrying the patched mutter build
    zypper ar https://download.opensuse.org/repositories/home:/tjyrinki_suse:/branches:/openSUSE:/Factory/openSUSE_Factory/home:tjyrinki_suse:branches:openSUSE:Factory.repo
    # pull in the patched mutter, allowing the vendor change to that repo
    zypper dup --allow-vendor-change
    (no other packages in that repo besides mutter)

    I just happen to have a general interest in making full use of my monitor's capabilities, so I try patches every now and then. Don't count on that repo being there indefinitely, so maybe clean it up in a week or so.
    Last edited by Timo Jyrinki; 19 August 2020, 09:00 AM.


  • bug77
    replied
    Originally posted by caligula View Post
    Sure, but I think 10b has a different meaning in the context of HDR monitors. Photography professionals are quite happy with < 250 cd/m² panels, while the 10b HDR folks expect something as high as 1000 cd/m². In photography the goal is to cover a larger color gamut with higher precision (although 10b per channel doesn't even imply a larger gamut than sRGB), but for HDR gamers a large brightness/contrast range seems more important.
    HDR is a wide color gamut + enhanced contrast. 10bpc makes banding less likely when using a wider range of colors, even without using the enhanced contrast/brightness of HDR. Color gamut and luminosity are both essential aspects of HDR, but they're orthogonal to each other.

    What I was saying is that, spurred by HDR, we can now use 10bpc on the (Windows) desktop if we have a 10bpc monitor, even if it's not HDR-capable.


  • caligula
    replied
    Originally posted by bug77 View Post
    With the advent of HDR, we got consumer monitors with 10bpc support and so, after decades, this feature was also opened up in the drivers for consumer GPUs.
    Sure, but I think 10b has a different meaning in the context of HDR monitors. Photography professionals are quite happy with < 250 cd/m² panels, while the 10b HDR folks expect something as high as 1000 cd/m². In photography the goal is to cover a larger color gamut with higher precision (although 10b per channel doesn't even imply a larger gamut than sRGB), but for HDR gamers a large brightness/contrast range seems more important.

    Originally posted by bug77 View Post
    Not really an issue; you can't actually spot FRC with the naked eye. Native 10bpc is probably required for mastering content, but for consumption, 8-bit + FRC is really not an issue.
    Probably true. It was possible to spot FRC in dark areas when I still had a 6b + 2b TN panel, but it's a lot harder now that both pixel density is higher and the color tone differences are smaller.


  • bug77
    replied
    Originally posted by aufkrawall View Post
    Then it's not rendered in >8 bit and dithered down to 8 bit, i.e. done wrong.
    Whatever the reason, as long as it happens, I'll take 10bpc support, thank you.

    Originally posted by caligula View Post
    The fact is, monitor manufacturers are integrating 10b color support in their devices. You're not supposed to do some 10b -> 8b conversions on the GPU. The GPU/computer should just deal with 10b colors.
    Actually, 10bpc has been with us for a while. It was just confined to professional equipment (including video cards and drivers). With the advent of HDR, we got consumer monitors with 10bpc support and so, after decades, this feature was also opened up in the drivers for consumer GPUs.
    So I can excuse open-source developers for not getting support into shape until now, but now that it's not prohibitively expensive to do, we should get the option of a complete 10bpc pipeline, like you said.
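
    On the application side, asking for that 10bpc pipeline looks roughly like this (a hedged sketch of my own, not mutter code; link with -lEGL). Whether those bits then survive all the way to the display is the compositor's side of the pipeline, which is what the mutter work is about.
    Code:
    #include <EGL/egl.h>
    #include <stdio.h>

    int main(void)
    {
        EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
        eglInitialize(dpy, NULL, NULL);

        /* Request 10 bits per color channel (the usual 30-bit layout). */
        const EGLint attribs[] = {
            EGL_RED_SIZE,   10,
            EGL_GREEN_SIZE, 10,
            EGL_BLUE_SIZE,  10,
            EGL_ALPHA_SIZE,  2,  /* 2-bit alpha fills out the 32-bit pixel */
            EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_NONE
        };

        EGLConfig cfg;
        EGLint count = 0;
        if (eglChooseConfig(dpy, attribs, &cfg, 1, &count) && count > 0)
            printf("driver exposes a 10bpc EGLConfig\n");
        else
            printf("no 10bpc EGLConfig found\n");

        eglTerminate(dpy);
        return 0;
    }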

    Originally posted by caligula View Post
    Something like 90% of 4k screens already support 10b colors. Sadly, only the top 10% do it natively without FRC.
    Not really an issue; you can't actually spot FRC with the naked eye. Native 10bpc is probably required for mastering content, but for consumption, 8-bit + FRC is really not an issue.


  • sandy8925
    replied
    Originally posted by caligula View Post

    The fact is, monitor manufacturers are integrating 10b color support in their devices. You're not supposed to do some 10b -> 8b conversions on the GPU. The GPU/computer should just deal with 10b colors. Something like 90% of 4k screens already support 10b colors. Sadly, only the top 10% do it natively without FRC.
    Ah, people discussing 8 bit FRC for HDR while my SDR monitor does 6 bit FRC + dithering.


  • nranger
    replied
    Originally posted by bug77 View Post

    ... Also, did I read the article right or is the fix really moving away from GBM and towards EGL?
    No, GBM is not going away. EGL is used by applications (including compositors like Weston) that speak OpenGL ES. Mesa uses GBM to abstract the various ways GPU drivers provide buffers, and Mesa's EGL platform support is in turn implemented on top of GBM.

    There are some GBM-specific ways EGL is implemented, and it doesn't support the EGL extensions Nvidia champions, EGLStreams and EGLDevice, which may be what you're thinking of.
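
    For illustration, the Mesa/GBM path described above looks roughly like this (a minimal sketch of my own; the render-node path is a guess, error handling is omitted, and you'd link with -lgbm -lEGL): GBM wraps the DRM device as a buffer allocator, and EGL is brought up on top of it via the GBM platform.
    Code:
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>
    #include <gbm.h>
    #include <EGL/egl.h>
    #include <EGL/eglext.h>

    int main(void)
    {
        int fd = open("/dev/dri/renderD128", O_RDWR);   /* hypothetical node */
        struct gbm_device *gbm = gbm_create_device(fd); /* buffer allocator */

        /* EGL on top of GBM, the Mesa path compositors use. */
        EGLDisplay dpy = eglGetPlatformDisplay(EGL_PLATFORM_GBM_KHR, gbm, NULL);
        EGLint major, minor;
        if (eglInitialize(dpy, &major, &minor))
            printf("EGL %d.%d running on top of GBM\n", major, minor);

        eglTerminate(dpy);
        gbm_device_destroy(gbm);
        close(fd);
        return 0;
    }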
