xf86-video-modesetting TearFree Page-Flipping Merged


  • #21
    Originally posted by Monsterovich View Post

    the architecture of which was made for the ages.
    Sure, if your hardware is ages old. It is 2022 and the Xorg desktop still doesn't support mixed refresh rates on multiple monitors properly, dual GPUs on laptops are still a mess (especially when you have screen outputs wired to both the iGPU and the dGPU), and the whole glorified Xorg selling point of network transparency doesn't work with modern stuff anyway; 99% of users don't care about it.
    Last edited by dragonn; 27 December 2022, 07:51 AM.



    • #22
      Originally posted by dragonn View Post
      Sure, if your hardware is ages old. It is 2022 and the Linux desktop doesn't support mixed refresh rates on multiple monitors properly
      Just for you, I connected an external monitor to my laptop; even on this low-end shitty iGPU two monitors work fine, 60 Hz and 120 Hz. Works as intended.



      Originally posted by dragonn View Post
      dual GPUs on laptops are still a mess (especially when you have screen outputs wired to both the iGPU and the dGPU), and the whole glorified Xorg selling point of network transparency doesn't work with modern stuff anyway; 99% of users don't care about it.

      What's the problem with using a discrete card only? Why use a dummy GPU instead of a real GPU?



      • #23
        Originally posted by Monsterovich View Post

        Just for you, I connected an external monitor to my laptop; even on this low-end shitty iGPU two monitors work fine, 60 Hz and 120 Hz. Works as intended.
        No, they don't work. You don't even know what to test.
        All monitors sync to the highest-refresh-rate one. Run glxgears and, no matter which monitor you move its window to, it will run at the highest refresh rate, which in the case of 144 Hz and 60 Hz will produce tearing on the 60 Hz monitor because the rates don't match up.
        This isn't mixed refresh rate support; everything clearly runs at one refresh rate.
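
        If you actually want to reproduce the test, something along these lines is enough (a rough sketch; vblank_mode is Mesa-specific, adapt it to your driver):

        # show the active mode of every output (the one marked with a *)
        xrandr --query | grep '\*'
        # run a vsynced GL client and drag its window between the 144 Hz and the 60 Hz monitor
        vblank_mode=3 glxgears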

        Originally posted by Monsterovich View Post
        What's the problem with using a discrete card only? Why use a dummy GPU instead of a real GPU?
        Again, clearly you haven't used a multi-GPU laptop in years. This is often just impossible: on many laptops the internal screen is hardwired to the iGPU and you cannot change that. You can try running Xorg with reverse PRIME, but that still uses the iGPU to display what the dGPU has rendered, and because of how stupidly hacky the PRIME extension in Xorg is, you have to put an `xrandr --auto` command into your DE and login manager autostart to enable the screen, because Xorg doesn't enable the reverse PRIME screens by default and you end up with a black screen...
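
        For reference, the usual incantation people end up putting into autostart looks roughly like this (the provider names are just an example from a typical Intel + NVIDIA setup; check yours with `xrandr --listproviders`):

        # let the iGPU (modesetting) scan out what the dGPU renders
        xrandr --setprovideroutputsource modesetting NVIDIA-0
        # enable the otherwise-black outputs
        xrandr --auto
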
        Also, a modern iGPU is not a "dummy GPU", and using it when the dGPU is not needed has many real-world advantages, like:
        - not wasting power and producing unnecessary heat, which makes your laptop quieter
        - being able to disconnect your laptop from AC and just go, without having to relog/reboot to switch back to the iGPU

        For god's sake, why are so many users so ignorant and unable to think outside their own use case?




        • #24
          Originally posted by intelfx View Post
          So, it was as easy as adding just one tiny option to the X server? And no Wayland was needed after all?
          xf86-video-intel/amdgpu/ati have supported TearFree for around a decade, so the fact that it's possible isn't exactly news. The fact that it's taken so long for the modesetting driver to support it says a lot about the remaining level of developer activity for that driver and Xorg in general though.

          It's not like TearFree is equivalent to a Wayland compositor though; in particular, TearFree cannot 100% reliably prevent tearing-like artifacts in all circumstances, since it cannot ensure that drawing requests are grouped together as needed for a consistent visual appearance. When I was working on TearFree support in xf86-video-amdgpu/ati, I always thought of it as a "poor man's Wayland compositor".

          Originally posted by awesz View Post

          xrandr --output DisplayPort-0 --set TearFree on

          Only xf86-video-amdgpu/ati support this, not even -intel yet AFAICT.
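
          For the modesetting driver the newly merged option goes into xorg.conf instead; something along these lines should do it (a sketch, untested here; check the updated man page for the exact accepted values):

          Section "Device"
              Identifier "Card0"
              Driver "modesetting"
              Option "TearFree" "true"
          EndSection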



          • #25
            Originally posted by dragonn View Post
            No, they don't work. You don't even know what to test.
            All monitors sync to the highest-refresh-rate one. Run glxgears and, no matter which monitor you move its window to, it will run at the highest refresh rate, which in the case of 144 Hz and 60 Hz will produce tearing on the 60 Hz monitor because the rates don't match up.
            This isn't mixed refresh rate support; everything clearly runs at one refresh rate.
            And how do you propose to fix this, at least in theory? Even Windows hasn't fixed this problem; the refresh rate there is limited to the lowest rate supported across all screens. The window has to sync to some refresh rate anyway. Even if you make the sync target switch to whichever screen the window moves to, what happens when the window spans two screens at the same time?

            Originally posted by dragonn View Post
            Again, clearly you haven't used a multi-GPU laptop in years. This is often just impossible: on many laptops the internal screen is hardwired to the iGPU and you cannot change that.


            It's a shitty limitation of those laptops. I don't like it either, but Xorg is not to blame.

            Originally posted by dragonn View Post
            Also, a modern iGPU is not a "dummy GPU", and using it when the dGPU is not needed has many real-world advantages, like:
            - not wasting power and producing unnecessary heat, which makes your laptop quieter
            - being able to disconnect your laptop from AC and just go, without having to relog/reboot to switch back to the iGPU

            In fact, you can simply force the lowest GPU frequency. I don't think that using the iGPU instead of the dGPU is a good way to save battery power.
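
            For an AMD dGPU, for example, that is a one-liner (the card index is an example; yours may differ):

            # pin the dGPU to its lowest clocks via the amdgpu sysfs interface
            echo low | sudo tee /sys/class/drm/card1/device/power_dpm_force_performance_level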

            Originally posted by dragonn View Post
            For god's sake, why are so many users so ignorant and unable to think outside their own use case?

            As if Wayland fanatics think about users and cover all the use cases of Xorg users? Also, you talk about things while having a poor idea of how they should work, what the downsides might be, and so on.



            • #26
              Originally posted by dragonn View Post
              For god's sake, why are so many users so ignorant and unable to think outside their own use case?
              A fitting question for all the Wayland clowns who think "capturing keystrokes" is a security bug and not a feature, just because they use their PC like a mobile kiosk.



              • #27
                Originally posted by Monsterovich View Post

                And how do you propose to fix this, at least in theory? Even Windows hasn't fixed this problem; the refresh rate there is limited to the lowest rate supported across all screens. The window has to sync to some refresh rate anyway. Even if you make the sync target switch to whichever screen the window moves to, what happens when the window spans two screens at the same time?
                That is an edge case that doesn't happen very often; most users don't keep their windows straddling a screen edge all the time. And Wayland clearly did manage to fix this for the normal situations.

                Originally posted by Monsterovich View Post
                It's a shitty limitation of those laptops. I don't like it either, but Xorg is not to blame.
                No, Xorg is to blame for that. Something like 90% of the gaming laptops on the market have that limitation. You can call them shitty; it doesn't matter, the software still has to deal with it. Software always has to bend to the rules of the hardware, not the other way around (simply because hardware is expensive and you cannot just replace it easily).

                Originally posted by Monsterovich View Post
                In fact, you can simply force the lowest GPU frequency. I don't think that using the iGPU instead of the dGPU is a good way to save battery power.
                You think; I know, because I have been driving an ASUS Zephyrus G14 daily for almost 3 years, and you are plain wrong.
                Using the iGPU as much as possible is the single most important thing you can do on a laptop running on battery. Even keeping the dGPU powered on at its lowest frequency will cut your battery life by at least 40%.
                Do you really think CPU manufacturers would bother putting an iGPU into their mobile gaming CPUs if it made no sense to use it?

                Originally posted by Monsterovich View Post
                As if Wayland fanatics think about users and cover all the use cases of Xorg users? Also, you talk about things while having a poor idea of how they should work, what the downsides might be, and so on.
                I am not a Wayland fanatic, not at all. I still use Xorg as my daily driver because I haven't found a good replacement for awesome wm yet.
                Is Wayland perfect and ready for everyone? No, not at all, and it will still take a lot of time before all the missing, hard-to-do stuff is done.
                But is Xorg fundamentally broken in its design and in need of being replaced by something that fully breaks backward compatibility? Yes, definitely.
                Will that be Wayland? Yes, definitely; it is way too late to step back and start over, and it clearly does provide advantages over Xorg. Look how much time it took Wayland to get to the state it is in now, and that is with most of the big parties involved and mostly agreed on what the right way is. No X12 is going to show up and break through.



                • #28
                  Originally posted by Weasel View Post
                  A fitting question for all the Wayland clowns who think "capturing keystrokes" is a security bug and not a feature, just because they use their PC like a mobile kiosk.
                  But you know this is something that is being worked on too? Wayland will get global shortcuts for user apps; they just want to do it in a more secure way.
                  If you need that now, just don't use Wayland; no one forces you to yet. At some point it will be a feature, and then you can peacefully move over to Wayland.
                  Disabling vsync for games on Wayland is coming too.

                  Yes, some things in Wayland that sound trivial do take a lot of time, and sometimes some devs don't get what users clearly want and refuse to add it at first (disabling vsync), but the more parties get involved, the more gets done and the more "we need that function even if you don't agree with it" gets through.



                  • #29
                    Originally posted by dragonn View Post
                    But you know this is something that is being worked on too? Wayland will get global shortcuts for user apps; they just want to do it in a more secure way.
                    If you need that now, just don't use Wayland; no one forces you to yet. At some point it will be a feature, and then you can peacefully move over to Wayland.
                    Disabling vsync for games on Wayland is coming too.

                    Yes, some things in Wayland that sound trivial do take a lot of time, and sometimes some devs don't get what users clearly want and refuse to add it at first (disabling vsync), but the more parties get involved, the more gets done and the more "we need that function even if you don't agree with it" gets through.
                    Ok, what about smart hotkeys? I'm talking about actual power-user scripts using stuff like AutoHotkey on Windows (it works under Wine too, but only with Wine apps).

                    A "smart" hotkey is one that monitors input and acts according to multiple criteria, not just a specific hotkey combination. Heck it might as well not even be a hotkey to begin with. People have been using, for example, auto fills when writing emails with such personalized scripts (checks last input X characters if they match specific words, AND the email app is active and the text field is the one which has focus, etc, that's what's "smart" about it). Not me though.

                    I do use it for productivity apps. For example, there's an advanced audio spectral editor that I use which lacks a function to fill in spectral gaps from below or above the respective hole. You can do it manually by exporting a selection, ring modulating the exported selection (to create a mirror above/below the source), isolating one part of it (either above or below, whichever you want), and re-importing it into the editor, but it's extremely tedious, especially because you have to calculate the frequencies manually to know how much to ring modulate it (how far to move it up/down).

                    Instead, I wrote an "evil monitoring" script, which took me about a day (yes, it's quite complicated), that does all of this, including sending keystrokes (can that even be done in Wayland?) and so on. And it's activated in a smart manner, only when the editor is active of course, so it won't do anything if I alt-tab to another application to do something else. It does everything using the editor's built-in features: exporting selections, processing them in an external command-line application (SoX), then re-importing them into the editor. Now filling a gap takes about 2 seconds instead of a minute! It's extremely productive, so it was worth spending a day on it.
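
                    The X11 side of that kind of script boils down to very little; a rough sketch (the window title and the key binding are made up):

                    # act only while my editor has focus, then drive it with synthetic keystrokes
                    active=$(xdotool getactivewindow getwindowname)
                    case "$active" in
                      *"Spectral Editor"*)                          # hypothetical window title
                        xdotool key --clearmodifiers ctrl+shift+e   # e.g. the editor's "export selection" shortcut
                        ;;
                    esac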

                    This is just one example, and one that I use a LOT. So please don't talk to me about people's use cases. Wayland is oblivious to anything a power user would want.

                    You're going to say: well, why not ask for such a feature in the editor?

                    This is the real world. We don't always have what we want. That's why we automate and personalize our lives as power users. It's not open source, anyway.

                    But yes, you can see, listening to keystrokes is "very evil" indeed!!! Damn, saving me so much time must be evil. Who could possibly want to listen to keystrokes other than malware, right?
                    Last edited by Weasel; 27 December 2022, 11:02 AM.



                    • #30
                      Originally posted by dragonn View Post
                      That is an edge case that doesn't happen very often; most users don't keep their windows straddling a screen edge all the time. And Wayland clearly did manage to fix this for the normal situations.
                      In Wayland, each fix comes with 3-5 new limitations. Sometimes the fixes are outright kludges.

                      Originally posted by dragonn View Post
                      No, Xorg is to blame for that. Something like 90% of the gaming laptops on the market have that limitation. You can call them shitty; it doesn't matter, the software still has to deal with it. Software always has to bend to the rules of the hardware, not the other way around (simply because hardware is expensive and you cannot just replace it easily).


                      So 90% of laptops are crap. I'm not surprised, because laptops are basically toilet paper in the computer world (smartphones are even worse!). They shouldn't be, but that's what the market decides. Not to mention that the term "gaming laptop" is an oxymoron.

                      Originally posted by dragonn View Post

                      You think; I know, because I have been driving an ASUS Zephyrus G14 daily for almost 3 years, and you are plain wrong.
                      Using the iGPU as much as possible is the single most important thing you can do on a laptop running on battery. Even keeping the dGPU powered on at its lowest frequency will cut your battery life by at least 40%.
                      Do you really think CPU manufacturers would bother putting an iGPU into their mobile gaming CPUs if it made no sense to use it?

                      Sounds like poorly designed power management. Perhaps in both software and hardware.

                      Originally posted by dragonn View Post
                      Will that be Wayland? Yes, definitely; it is way too late to step back and start over, and it clearly does provide advantages over Xorg. Look how much time it took Wayland to get to the state it is in now, and that is with most of the big parties involved and mostly agreed on what the right way is. No X12 is going to show up and break through.

                      Wayland has even bigger and worse design flaws than Xorg. And the developers don't even want to acknowledge them because of their stupidity and arrogant stubbornness. This is already a clinical case.

