KDE's Nate Graham On X11 Being A Bad Platform & The Wayland Future


  • Originally posted by damian101 View Post

No, HDR usually means a PQ or HLG transfer curve, which are defined in BT.2100, not BT.2020, and which have nothing to do with bit depth or color gamut.
Wrong again, moron. Better luck next time.



    • Originally posted by microcode View Post

      As much as I enjoy working on software, when I have to use computers for productive activity, I can't be satisfied with something that is ‘improving’, nor with something that ‘will get there’. I'm going to use the thing that WORKS, until the other thing ALSO WORKS.
Well, tough. Just accept your apps' defaults and Wayland also JUST WORKS. Don't force apps that haven't declared themselves ready for Wayland to use it by default, and you're good to go. Of course, you'll rarely find an app where you'd actually run into trouble if you did, but that way you can be really sure.
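For what it's worth, the per-app backend choice usually comes down to a handful of well-known toolkit environment variables. Below is a minimal Python sketch of launching a program pinned to XWayland or to native Wayland; "someapp" is a placeholder, and whether a given toolkit honours these variables depends on its version.

Code:
import os
import subprocess

# Well-known toolkit backend overrides (support varies by toolkit version).
BACKEND_VARS = {
    "x11":     {"QT_QPA_PLATFORM": "xcb",     "GDK_BACKEND": "x11",
                "SDL_VIDEODRIVER": "x11"},
    "wayland": {"QT_QPA_PLATFORM": "wayland", "GDK_BACKEND": "wayland",
                "SDL_VIDEODRIVER": "wayland"},
}

def launch(cmd, backend="x11"):
    """Run `cmd` with its toolkit pinned to the given display backend."""
    env = dict(os.environ, **BACKEND_VARS[backend])
    return subprocess.run(cmd, env=env)

# Hypothetical example: keep a not-yet-ready app on XWayland.
launch(["someapp"], backend="x11")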



      • Originally posted by Artim View Post

Translated from the German Wikipedia:
        Conventional SDR uses a color depth of 8 bits, which corresponds to a dynamic range of around 6 f-stops (64:1).[1] HDR video uses a color depth of 10 bits and thus achieves a dynamic range of up to 17.6 f-stops (200,000:1, on a corresponding output device with 2,000 cd/m²).



No, just fuck off, you stupid moron. Jesus, are you thick and incompetent. Your comment about an "HDR based transfer" to describe HDR doesn't make any sense, not even logically.
Congratulations, you linked a Wikipedia article. Forgive me, since I'm not German and I don't really care about correcting a Wikipedia source all that much, but I believe the source they cite for what you're referring to is the BBC, not some international standard. Even disregarding that, I highly recommend you actually read the paper in question. Here is a direct quote from it explaining the necessity of the transfer:

Overall the conventional display gamma curve is not adequate for HDR reproduction and a different non-linearity is required
And that's ignoring that this paper is specifically about the HLG transfer. I would highly recommend looking into the things you cite instead of blindly repeating someone else's, in this case very bogus, interpretation.
        Last edited by Quackdoc; 28 December 2023, 02:31 PM.



• No Artim, you are the ridiculous one here (and the personal attacks are unnecessary). I've been contributing to MATE's bug tracker and reporting issues as I find them; I found an issue in the MATE main menu and it is being worked on. You are assuming that people are not contributing when they are.

There is a move by Red Hat to disable X11 and force a broken default onto people. MATE's developers are not to blame for this, and neither are the users. The issues I am talking about also exist under Wayfire, which is meant to be a fully Wayland-native environment. If the Wayland-native environments are broken, and the non-native environments are broken, then what are users supposed to do? Forcing people into KDE/GNOME is not an acceptable move for a platform that has been entirely about user choice.

The reasonable path forward is to continue to support packaging/running X11 while advocating for, and developing on, Wayland. The move to Wayland is progressing, but it's still a while away from being ready as the default, and certainly nowhere near ready to be the only display server offered in distributions.
          Last edited by DMJC; 28 December 2023, 02:34 PM.



          • Originally posted by Artim View Post

Then where is the HDR support in X? There is none and there never will be, while KDE IS implementing it right now, as is Valve. Well, what a bad example.
They're implementing it themselves despite it not having landed in any upstream Wayland protocol. This is HDR support despite Wayland, not HDR because of Wayland. HDR because of Wayland is still down the road.

            Originally posted by Artim View Post

Let me guess, you either work on crappy Nvidia hardware or are messing with settings you shouldn't? Because nobody who isn't has problems worth mentioning.
Nvidia still has the majority of the market when it comes to hardware, and it leads in ray tracing and AI by a large margin. Their hardware isn't "shitty"; it's their software practices that are shitty.

            Originally posted by bug77 View Post
There is, on KWin+Wayland. I ticked that box yesterday and it did something with the colors (couldn't say more than that, my monitor sucks at HDR).
            See the above.

            Originally posted by andyprough View Post
At least you acknowledge the truth - Wayland needs Xorg, because XWayland is nothing more than another Xorg X server. And without XWayland, your legacy application support is going to completely suck, which is entirely unacceptable in enterprise and institutional settings. Think banks, governments, hospital corporations, militaries, stock exchanges, power plants, police agencies, etc. Regardless of the shrieking of the Wayland sycophants for the death of Xorg, Xorg is here to stay for a good long while, because Wayland can't survive without it.
Wayland needs X11*. XWayland is an implementation of the X11 protocol; it most certainly is not Xorg running inside Wayland. Xorg is simply another implementation of the same protocol.



            • Originally posted by Quackdoc View Post

Congratulations, you linked a Wikipedia article. Forgive me, since I'm not German and I don't really care about correcting a Wikipedia source all that much, but I believe the source they cite for what you're referring to is the BBC, not some international standard. Even disregarding that, I highly recommend you actually read the paper in question. Here is a direct quote from it explaining the necessity of the transfer.



And that's ignoring that this paper is specifically about the HLG transfer. I would highly recommend looking into the things you cite instead of blindly repeating someone else's, in this case very bogus, interpretation.
HLG is not HDR; it can be used for HDR. But what HDR actually is, is right there on page 7 of the text I translated for you. It's even already in English. So just read it yourself, you moron.



              • Originally posted by avis View Post

                I dare you to find posts by distinguished Linux developers defending the things I've posted.

                I don't have a selective memory but you seem to have false memories. People did argue about this stuff. Developers? Nothing that I remember.
How is this different from the Wayland situation? Most developers agree that Wayland is a better platform to develop against than X11. When systemd exposed various overly generous assumptions made by init scripts and system services, systemd was "broken". When systemd 230 broke nohup and, consequently, tmux and the like, systemd was "broken". The suckless project has an entire write-up on why systemd is conceptually "flawed". Distros like Void Linux, Alpine, or Devuan spent resources on not having to ship systemd. And yet the majority of developers today agree that systemd is, ultimately, a good thing.

As I have said before, the X11-to-Wayland transition affects more people in more significant ways than any other large architectural change ever has. I'd also argue that it is the largest architectural change the Linux ecosystem has ever attempted. Hence the difficulties, the criticism, the pushback, the drama, and then more pushback and more drama fueled by the pushback. And against the backdrop of all this nonsense, the developers of the Wayland infrastructure are trying to resolve things in ways that won't rely on 1980s assumptions.



                • Originally posted by Artim

It's really not. You might call it beta testing, but it's literally using an official, if still staging, protocol. That is not "despite Wayland", unlike with X, where you can only have HDR when X isn't involved.



True. Though AI usually doesn't happen on desktops, but on servers, where you won't find any GUI and thus no Wayland. That's the reason they released part of their drivers as open source. They are still leading in AI, but Intel and AMD want to compete too, and they are getting closer by the day. AMD has ROCm, and Intel has its oneAPI platform. Of course I can't tell how good they are compared to CUDA, but Intel and AMD only have to get close enough. If Nvidia's performance benefits are eaten up by driver issues, people might just use something cheaper with a little less performance but more reliability.



In a few cases it does. But those cases are getting fewer by the day; the days of X are numbered. Fedora is dropping X now, RHEL soon after. Others will follow suit.
No, it's not using an official protocol. This is a protocol developed by the KDE and Valve devs for use until a proper protocol is completed upstream (source: English - Planet KDE). I get that things are moving fast, but that blog post is only 10 days old, so I think it's still safe to say they aren't using anything official yet.

AI is used by gamers every day in the form of DLSS. Ray reconstruction too, but that's much more niche. These are Nvidia-only features, but considering their market share it's safe to say AI is a staple for gamers playing anything recent. I certainly hope Nvidia gets dethroned, but that's because I dislike their approach to the open source ecosystem.

Also, obviously Xorg is dying - but X11 in the form of XWayland is here to stay. All the Linux machines I administer use Wayland. My only real gripe with it is that KDE's global menus don't work with GTK applications or LibreOffice.



                  • Originally posted by Artim View Post

HLG is not HDR; it can be used for HDR. But what HDR actually is, is right there on page 7 of the text I translated for you. It's even already in English. So just read it yourself, you moron.
I'm not sure what you read, but again, read the linked PDF:

                    Dynamic range is the ratio between the whitest whites and blackest blacks in an image
                    ...
Dynamic range is often measured in “stops”, which is the logarithm (base 2) of the ratio
...
the conventional SDR gamma curve (Rec 1886), and an alternative HDR, a perceptual quantisation curve (PQ 10K) defined in SMPTE ST 2084
The PDF itself says many times that what makes HDR is the ratio between the whitest whites and the blackest blacks - how many stops an image can show. That is defined by the transfer. Additional bit depth gives you more data between stops, but it does not give you more stops.

Standard dynamic range consumer television (8 bit video, e.g. DVD, SD and HD DVB) only supports about 6 stops of dynamic range, as discussed below. Professional SDR video (10 bits) supports about 10 stops. But the human eye can see up to about 14 stops (1) of dynamic range in a single image.
...
Some debate has confused high dynamic range with high brightness. The two are not the same. ... What high brightness does allow is to see high dynamic range without needing a very dark viewing environment.
...
It might be thought that higher dynamic range could be achieved by simply making displays brighter. But this is analogous to suggesting that you can increase the dynamic range of audio by turning up the volume. With audio, turning up the volume merely emphasises the noise. The same is true for video.
                    ...
                    The useful dynamic range of video is determined by the ratio between adjacent quantisation levels.
Bit depth by itself does not produce an HDR image. You need an appropriate transfer to shape the light intensity, along with enough bit depth to prevent banding. It's also worth noting that dynamic range and gamut (i.e. BT.2020) are two separate things, as this article states once again:

it has the potential to deliver wider colour gamut (WCG), higher frame rates (HFR), and higher dynamic range (HDR)
A high dynamic range image requires the appropriate range of brightness, and to get that we need an appropriate transfer. PQ, scRGB, and HLG are HDR transfers: they give us the fine steps between brightness levels that SDR transfers like sRGB or gamma 2.2/2.4 don't. scRGB is unique in that it uses a linear transfer, but it brute-forces dynamic range by throwing a large bit depth at the problem, requiring 16 or 32 bits of data. (See the sketch below.)
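To make "the transfer, not the bit depth, determines the stops" concrete, here's a rough Python sketch of the paper's criterion that usable dynamic range ends where adjacent quantisation steps become visibly large. The 5% step-visibility threshold and the 100 cd/m² SDR peak are my assumptions for illustration; the PQ constants are the ones from SMPTE ST 2084.

Code:
import math

# SMPTE ST 2084 (PQ) constants, exact rationals from the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(s: float) -> float:
    """PQ signal [0,1] -> absolute luminance in cd/m^2 (peak 10,000)."""
    if s <= 0.0:
        return 0.0
    p = s ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def gamma_eotf(s: float, gamma: float = 2.4, peak: float = 100.0) -> float:
    """Conventional display gamma (BT.1886-style), assumed 100 cd/m^2 peak."""
    return peak * s ** gamma

def linear_eotf(s: float, peak: float = 100.0) -> float:
    """Naive linear encoding, to show why it wastes bits."""
    return peak * s

def usable_stops(eotf, bits: int, threshold: float = 0.05) -> float:
    """Stops between peak white and the darkest code value whose step to the
    next code is below `threshold` relative luminance (assumed banding
    visibility limit; the exact number is a guess, not from the paper)."""
    n = 2 ** bits
    levels = [eotf(code / (n - 1)) for code in range(n)]
    for lo, hi in zip(levels, levels[1:]):
        if lo > 0.0 and (hi - lo) / lo < threshold:
            return math.log2(levels[-1] / lo)
    return 0.0

print(f" 8-bit gamma 2.4: {usable_stops(gamma_eotf, 8):5.1f} stops")   # ~6
print(f"10-bit gamma 2.4: {usable_stops(gamma_eotf, 10):5.1f} stops")  # ~10
print(f"10-bit linear:    {usable_stops(linear_eotf, 10):5.1f} stops") # ~6
print(f"10-bit PQ:        {usable_stops(pq_eotf, 10):5.1f} stops")     # ~17

With the same 10 bits, the gamma and linear curves land around 10 and 6 stops while PQ lands around 17, which is the whole point: the transfer sets how many stops you get, and bit depth just sets how finely they're quantised.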

I can't read the Wikipedia article, but either it is wrong, or your interpretation of it is.



                    • Originally posted by avis View Post

                      I dare you to find posts by distinguished Linux developers defending the things I've posted.
Let's do it the other way around: I dare you to find any distinguished Linux developer who is *against* Wayland.

                      Originally posted by avis View Post
                      I don't have a selective memory but you seem to have false memories. People did argue about this stuff. Developers? Nothing that I remember.

Poettering had a post where he admitted people had verbally assaulted him for creating systemd, but he did not post long, convoluted write-ups about why systemd was better than SysVinit. The latter was just laughable.
                      You are kidding, right? Just start here: https://0pointer.de/blog/projects/systemd.html


                      Now... please, just stop. Stop trolling and grow up.

