KDE Saw Many Bug Fixes This Week From KWin Crashes To Plasma Wayland Improvements


  • #41
    Originally posted by TemplarGR View Post

    Again, a lot of blah blah blah blah. I am tired of this. You are presenting niche use cases as the norm. You are completely delusional. Who in their right mind runs LXDE with mainly KDE apps? Why use LXDE in the first place?

    So you need a lightweight, barebones, desktop environment, but proceed to bring the KDE bloat to it anyway?

    And all this just to pretend KDE apps should be considered a "different thing" than the KDE desktop. All this because you need to call it "Plasma 5" instead of "KDE 5" because you have vanity issues....
    You know Linux in itself, distro/DE choices aside is considered niche already right? Silly issue to raise as not being the norm. Why use Linux in the first place?(that's rhetorical, you don't need to answer that)

    KDE bloat? What exactly do you consider bloat? Last I knew, that was something older versions of KDE had a bad reputation for regarding memory usage, but nowadays GNOME has taken that crown with its poor performance and much higher memory usage. I understand there's been a lot of work to fix that now, so it probably isn't using over 1GB anymore, and compositor performance for things as simple as displaying the app menu/launcher should no longer be sluggish.

    If bloat to you is the number of packages, that's a silly definition of bloat. If it's file size, that's fair if you can quantify it as an actual concern, but it likely uses a small portion of the total disk space available, so whatever the added filesize weight is, it's doubtful that it's a serious concern.

    I used GNOME until around 2016 and switched to KDE; I had too many issues with GNOME, and it got better (though not perfect) with KDE. The apps were often much nicer, and if I were to go back to GNOME in future, I'd probably still use quite a few of the apps I've enjoyed that are maintained by KDE devs.

    ---

    As for the KDE vs Plasma thing, who cares. If you want to specifically talk about the desktop environment, people call it Plasma these days, but if you want to call it KDE and the context makes it evident that you're referring to the DE/Plasma, it really doesn't matter.

    Adding a number, though, is a bit different. It's pretty clear that the version number is associated with Plasma now, while the other parts of KDE have their own respective version numbers (Frameworks retains 5.x currently, but point releases are monthly iterations, not in sync with Plasma 5.x releases; Applications, where the actual apps are, is under a year.month versioning scheme, again unrelated to a specific version of Plasma). So if there's a Plasma 6.x release, there could be a 6.x release for Frameworks, but Applications would not be adjusting its versioning, thus it wouldn't fall under a KDE 5/6/7, would it?

    Windows is mostly referred to as just Windows; versioning is more specific and not sequential: Windows Vista, Windows 95, Windows 2000, Windows 7/8/10 (let's skip 9 because we feel like it). In fact, Windows 10 is intended to keep that number and be a rolling release of sorts; you instead get new releases/updates with a similar year/month versioning to reference them, as is common with distros like Ubuntu or KDE Applications. If it helps, just say KDE Plasma 5.18; it's not really KDE 5, despite what past releases used prior to the changeover.

    Nothing wrong with just calling it KDE, though, like you would Windows or macOS (which is another story). Just bringing up version/series numbers isn't really relevant anymore going forward.



    • #42
      Originally posted by polarathene View Post

      Out of curiosity, why is it important to you for a DE to have such long uptime? Is there a problem with restarting/shutting down? Especially with modern hardware, this generally doesn't take long at all.

      ---
      It's just how I work I suppose - I do consulting for various clients at a time, so I get in the habit of having documents from LibreOffice and gedit open with various configs or inventory data to reference quickly, 5-6 different browsers full of tabs for each customer, and various IDEs/editors - hence the multiple monitors. I also tend to run different VM systems/appliances, so uptime is somewhat important, as it would be for a server, for all these reasons. Probably a bit of abuse of the term "desktop", but I've largely worked this way for 15 years with Linux desktops full-time.

      Newest KDE seems to have gotten better about putting windows back where they were, but it has been quite fond of jumbling all my windows randomly, infuriating me in the process given how I work (as per my first comment). Even worse, it was pretty dumb about it: it would take 3x 4K displays' worth of dozens of app windows and move them all to the very right-most display, piling them onto one window's position in doing so. As of 5.18.3 it doesn't seem to do that anymore, at the very least, but it did on my prior version until I recently upgraded my Arch build.

      Why does it jumble my windows? Display hotplug removal detection. I use 3x 48" TVs as my "monitors" for my desktop, and they don't behave like a VGA display would in powering down. When they power down, the PC no longer sees them on the wire and tries to move all my windows - terribly. I commonly get this with my Thunderbolt dock when using my laptop on the same displays as my desktop, as sometimes it'll lose them even when the displays stay on but the OS powers them off.

      Hardware + software quirks == annoying. I never have these display removal or window movement issues when using Cinnamon or MATE, though they do come with other quirks.

      Compositing between desktops is a hugely varying thing, particularly in stability and performance. KWin simply cannot handle 11520x2160 for more than a few days without coming unglued to the point it'll crash. It also uses an absolute ton of cpu, memory, and gpu in doing so too. My desktop had 20 cores/128gb of ram, my laptop has 8 cores/64gb ram; it isn't for a lack of resources, and it would still destabilize within days, like a memory leak but worse. I can disable kwin, but losing features there causes me other annoyances - mostly that cairo-dock without compositing causes banding on my displays and overlay issues now.

      Cinnamon compositing works pretty well lately, if a little choppy with videos, only something (maybe it?) was causing a hard-lock every few days that I never could figure out, so back to KDE for now...

      I use my laptop now with Nvidia and Intel GPUs; I just leave the Nvidia without drivers, as everything else proved unstable years ago last time I tried. I had to abandon my desktop when the 1070 video card, out of nowhere, would only load in GLES mode with drivers (hardware failure?), breaking any graphics loading, and I never did figure out why. My track record with nvidia has not been great the past few years, starting to consider amd again as a gpu solution.

      As the great equalizer in all this, I got sick enough of it all to dual-boot into the windoze10 that came on my laptop and actually try to use it for a bit. It couldn't deal with the resolutions either, in stability or on-screen choppiness, or even let me fully bring up the 3x 4k displays. Suddenly I'm a bit less angry at the Linux DE counterparts now, but still, it would be nice to see some of these quirks addressed vs. frivolous new features, as I and dozens of others have posted bug reports for years about this, myself as far back as the KDE 4.x days.



      • #43
        Originally posted by mikus View Post
        It's just how I work I suppose - I do consulting for various clients at a time, so I get in the habit of having documents from LibreOffice and gedit open with various configs or inventory data to reference quickly, 5-6 different browsers full of tabs for each customer, and various IDEs/editors - hence the multiple monitors. I also tend to run different VM systems/appliances, so uptime is somewhat important, as it would be for a server, for all these reasons. Probably a bit of abuse of the term "desktop"
        Sounds like you should be running a bare server, not a DE, as the host system. That'll have no issues with uptime, and you can keep your services/appliances running via VMs no problem. The other use cases that need a DE with GUI apps would be served well with a VM per client. If it's a driver issue, perhaps a virtual video driver like VMware's might be more stable for you, or you can more easily use whatever DE/OS suits you best on a per-case basis.

        For most apps, traditional VM setups should be sufficient. You can get fullscreen and multi-screen going, where each screen is a window until you set it to be fullscreen; that should work around the TV display state issues that cause your windows to be shuffled around, since they'll all still be contained in that specific window as a group.

        For any case where you need more native-like graphics performance, there is VFIO (see r/vfio for a community around this that's quite helpful; the Arch Wiki has plenty of juicy info too on getting set up). This isn't just for graphics though: you can get near native/host performance in your VM guests for practically any part of the system. Generally this means slicing up the system resources, however, as you pass them to the guest to use exclusively (e.g. you can assign cores/threads just for that VM, disks, GPU, memory allocation etc), sort of like running multiple host OSes in parallel.
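        A rough sketch of the usual first step toward this kind of passthrough is checking which devices share IOMMU groups, since a device can generally only be handed to a guest along with its whole group. The helper name here is made up, but /sys/kernel/iommu_groups is the standard sysfs location (populated only when the IOMMU is enabled in firmware and kernel):

```shell
#!/bin/sh
# List PCI device addresses grouped by IOMMU group (illustrative sketch).
list_iommu_groups() {
    base=${1:-/sys/kernel/iommu_groups}
    for dev in "$base"/*/devices/*; do
        [ -e "$dev" ] || continue       # no groups: IOMMU off or unsupported
        group=${dev%/devices/*}         # .../iommu_groups/<n>
        group=${group##*/}              # <n>
        printf 'group %s: %s\n' "$group" "${dev##*/}"
    done
}

list_iommu_groups
```

        Piping each printed address through `lspci -nns` gives human-readable device names, which is roughly what the Arch Wiki's equivalent snippet does.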

        This does have a drawback: the VMs lose certain state features, like being able to save the VM state/snapshot; it ends up like a host system where you can only suspend/shutdown/hibernate it (depending on the resources). If you pass through a GPU (an Intel iGPU can have its resources split across VMs, whereas a dGPU like Nvidia's would be a full passthrough), then instead of virtual displays you get direct display output from the GPU, like a host system; that display is no longer shared with anything else (unless you have multiple inputs to it and a way to toggle/cycle through them).

        Splitting up your work this way would allow you to be a bit more flexible, so that if there is a compositor issue, it's localized to a smaller scope that doesn't affect the rest of your system, and you might be in a better position to fix it more easily (such as by rebooting the VM). It'd be useful for me as well; I definitely don't need all my browser windows/tabs and apps for a variety of projects open 24/7 and munged together, so it'd provide better separation of my projects... but I've yet to actually sort this out. It's better these days: the issues I had when first looking into this kind of setup were related to file sharing/access being a bit annoying, and now there are things like virtio-fs that I think address this better.

        It's very common for Windows to be run under VFIO setups like those on r/vfio. There's software called Looking Glass which allows using the GPU passthrough approach while displaying those screens in a resizable/movable window like a traditional VM, yet more efficiently than VNC: instead of network screen capture, the guest writes frames to shared memory and the host (or another guest) reads from that same shared memory to display them. It only captures from Windows though, I think; no Linux guest support. You can also combo that with display dummy plugs; these occupy a display output and emulate a connected display of whatever resolution/framerate, so your host-connected displays can physically share the same display device as your guest VM, while being powered by two different OSes and GPUs.

        One other benefit of all this, with Intel at least, is a live migration feature that can send one VM's state to another machine, e.g. from desktop to laptop or vice versa, so long as the resources/requirements are sufficient for migrating the VM across the two systems. The base image can be on both systems and you're just transferring the state, so once set up it's not necessarily as big of a transfer as it sounds.

        Originally posted by mikus View Post
        Newest KDE seems to have gotten better about putting windows back where they were, but it has been quite fond of jumbling all my windows randomly
        I'm always working with fixed displays, I don't change the display count on a system so I haven't experienced that. I do remember reading about something like that getting attention in past months, so perhaps it arrived with Plasma 5.18 or a recent monthly KDE Frameworks update?

        Originally posted by mikus View Post
        It also uses an absolute ton of cpu, memory, and gpu in doing so too. My desktop had 20 cores/128gb of ram, my laptop has 8 cores/64gb ram
        I only recall CPU usage going high when kwin has fucked up; that's when it's not able to recover properly and problems start to occur. With the current desktop system, for example, I can't seem to view 3D content on Sketchfab; it cites some compatibility error and drops to 360 image rotation mode. That should resolve itself once I'm ready to restart the system. But when kwin is no longer using the GPU properly, it will use more CPU for anything it's doing, like window resizing/moving, iirc.

        It can also happen when updating the system kernel, I think, and maybe GPU drivers too (Nvidia). I know that my system won't restart/shutdown via GUI methods after some updates due to this (the old kernel that is running has had its modules deleted, and it can't find the nvidia driver or whatever, which usually causes other problems for anything that wants to use the GPU). While Sketchfab wouldn't work, some other WebGL demo I tried last night on this system lagged horrendously; all my CPU was on full load and it was struggling to render a few frames a second. I think it will handle that much better after a restart.

        My laptop, with only a 2-core i3 CPU, 4GB RAM, and no dGPU (only the Intel iGPU), can boot with ~500MB of RAM in use and 0-1% CPU at idle. The desktop is probably similar, but I know it gets worse over time, especially when kwin fails and compositing takes a dive as a result.

        Originally posted by mikus View Post
        My track record with nvidia has not been great the past few years, starting to consider amd again as a gpu solution.
        Each option has issues afaik. Nvidia gets a lot of shit, but despite all the praise AMD gets, there's been plenty of cases where it's been pretty bad too. Sometimes it's specific GPU models/products that you need to do a bit of research into (harder if it's a fairly new product); other times it's just needing a newer kernel/Mesa, or waiting several years. If you don't need much vRAM or GPU grunt, going with older GPUs is apparently good; not too old though (I got an R240 or something, a budget GPU, but it was one generation too old to benefit from something that I wanted/cared about). The RX580 is apparently decent these days, might serve you well?

        Do note that on laptops, AMD is only now about to get support for the PSR power-saving feature for laptop displays (provided your display uses eDP 1.3 or higher, iirc). I bought a laptop at the end of last year which, while a new 2019Q3 release, used a 2017-manufactured display with eDP 1.2, which was from 2011? eDP 1.3 came out a year later, so I lucked out thinking the 10th-gen Intel CPU and WiFi AX were new enough that surely they'd not skimp on display tech that old, considering the benefits. AMD gets this support with its drivers coming in the 5.7 kernel afaik; Intel has had it for a long time (not relevant to Nvidia, as the display for laptops is usually handled by Intel, and Nvidia does some interaction with Intel's framebuffer to route its output afaik).

        AMD might turn out better for you, just fair warning that it's not always great despite what the community tends to imply.

        Originally posted by mikus View Post
        As the great equalizer in all this, I got sick enough of it all to dual-boot into the windoze10 that came on my laptop and actually try to use it for a bit. It couldn't deal with the resolutions either, in stability or on-screen choppiness, or even let me fully bring up the 3x 4k displays.
        Could it be due to hardware limitations? I don't know your particular setup or device capabilities, but I know from stuff like USB that there are a lot of gotchas despite whatever the marketing claims: they cite the protocol specs, but things like chipset and cable quality, the power supplied to that chipset from the device, and the target device(s) you connect to all contribute to what is actually supported and achievable. If the problem is consistent across OSes and it seems like it shouldn't be an issue, it's probably due to hardware.

        A simpler example to demonstrate that is WiFi devices: despite all their other variables, they can claim 802.11n support, an astounding 150Mbps (less than 20MB/sec), yet even if all other variables were perfect for utilizing that bandwidth limit, the product only has to claim 802.11n support, not actually deliver that performance. Similar can be seen with disk drives with poor performance marketing themselves as SATA 3 with 6Gbps (~600MB/sec, less with SATA overhead, and less again with USB overhead if external).
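        The unit conversions above can be sketched quickly; the 0.8 factor is SATA's 8b/10b line coding, which is why a 6Gbit/s wire rate tops out around 600MB/s of actual data (helper names are made up for illustration):

```shell
#!/bin/sh
# Convert a claimed link rate in Mbit/s to MB/s (decimal megabytes,
# ignoring protocol overhead): divide by 8 bits per byte.
mbps_to_MBps() { awk -v m="$1" 'BEGIN { printf "%.2f\n", m / 8 }'; }

# SATA uses 8b/10b line coding, so only 80% of the wire rate carries data.
sata_gbps_to_MBps() { awk -v g="$1" 'BEGIN { printf "%.0f\n", g * 1000 * 0.8 / 8 }'; }

mbps_to_MBps 150        # 802.11n single-stream claim -> 18.75 MB/s
sata_gbps_to_MBps 6     # SATA 3 -> 600 MB/s
```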



        • #44
          Originally posted by polarathene View Post
          Sounds like you should be running a bare server, not a DE, as the host system. That'll have no issues with uptime, and you can keep your services/appliances running via VMs no problem. The other use cases that need a DE with GUI apps would be served well with a VM per client. If it's a driver issue, perhaps a virtual video driver like VMware's might be more stable for you, or you can more easily use whatever DE/OS suits you best on a per-case basis.
          I also have a long-lived desktop. Partly for similar reasons, but partly because:
          1. I use my PC as my music player to fall asleep to, and my alarm clock
          2. I leave a *lot* of stuff open and organized so I can just sit down and get back to things in the morning
          3. My boot time is reasonably quick, but the pile of stuff I have to get running once I log in takes several minutes to start up.
          Aside from considering a session-killing crash unacceptable (I'd sooner try to run my entire desktop inside Xpra), I just hate the amount of work involved in getting everything laid out after a session restart. (I've been using KDE since 3.x. Session restore has always been such a flaky mess in combination with the applications that I use, such as gVim, that it's preferable to just launch a fixed list of applications on startup.)
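          Launching a fixed list of applications on login can be as simple as a script dropped into Plasma's autostart. A minimal sketch, assuming a plain-text list of commands; the script path, list-file location, and helper name are all made up for illustration:

```shell
#!/bin/sh
# Hypothetical ~/.config/autostart-scripts/restore-apps.sh: launch each
# command listed in a plain-text file, one per line, instead of relying
# on session restore.
launch_list() {
    while IFS= read -r cmd; do
        [ -n "$cmd" ] || continue            # skip blank lines
        case $cmd in '#'*) continue ;; esac  # skip comment lines
        sh -c "$cmd" &                       # fire and forget
    done < "$1"
    wait                                     # let the children get started
}

f=${1:-$HOME/.config/app-list.txt}
[ -f "$f" ] && launch_list "$f"
```

          The upside over session restore is determinism: the same apps launch the same way every time, regardless of what state the compositor was in at logout.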



          • #45
            Originally posted by ssokolow View Post
            I also have a long-lived desktop. Partly for similar reasons, but partly because:
            A VM solution to partition the organized setup you have is probably useful too, then? It is a bit of a hassle to approach and get set up though, I guess. I hope to eventually get around to trying it in my workflow, as my usual reason to need a reboot is memory build-up, usually from the web browser or other things that seem to leak memory over a long enough period, even if mostly idle.

            The hassle of setting everything back up is one I can relate to, especially with projects where I've got work spread over several apps and have to get everything back to that. Or with the web browser: it isn't able to return the restored windows back to their assigned virtual desktops for organization, so I've got to track the windows with a tab (I just have the tab use an image of a number from Google image search so that I can easily identify its group in the overview/expo previews). Terminals are also an annoyance, though I think restoring their individual sessions/history is possible; I just haven't looked into that yet. VMs would resolve most of this though.



            • #46
              Originally posted by polarathene View Post

              A VM solution to partition the organized setup you have is probably useful too, then? It is a bit of a hassle to approach and get set up though, I guess. I hope to eventually get around to trying it in my workflow, as my usual reason to need a reboot is memory build-up, usually from the web browser or other things that seem to leak memory over a long enough period, even if mostly idle.

              The hassle of setting everything back up is one I can relate to, especially with projects where I've got work spread over several apps and have to get everything back to that. Or with the web browser: it isn't able to return the restored windows back to their assigned virtual desktops for organization, so I've got to track the windows with a tab (I just have the tab use an image of a number from Google image search so that I can easily identify its group in the overview/expo previews). Terminals are also an annoyance, though I think restoring their individual sessions/history is possible; I just haven't looked into that yet. VMs would resolve most of this though.
              Unfortunately, it's too much hassle. I already have enough trouble allocating sufficient human time to improve and maintain my setup, and, as the startup time for my desktop should make you suspect, I can't justify adding that much overhead to the system.

              Also, I'm running a pre-PSP AMD chip, so Intel VM migration isn't an option.



              • #47
                Originally posted by polarathene View Post

                Sounds like you should be running a bare server, not a DE, as the host system. That'll have no issues with uptime, and you can keep your services/appliances running via VMs no problem. The other use cases that need a DE with GUI apps would be served well with a VM per client. If it's a driver issue, perhaps a virtual video driver like VMware's might be more stable for you, or you can more easily use whatever DE/OS suits you best on a per-case basis.
                Trust me, 20 years of building businesses from service providers to large enterprise - I get it. I have a rack downstairs full of Dell C6120s, Fibre Channel SANs, and 1-40GbE switching, but a few years ago I just shut most of it down as under-utilized. I got tired of the heat and power usage, and didn't need/want it anymore.

                I bought a Precision 7920 with dual CPUs, built it loaded for bear with 128GB RAM, NVMe disks, and a GPU, and it worked great for years reproducing all that virtually - until it didn't anymore, at least.

                Originally posted by polarathene View Post

                I'm always working with fixed displays, I don't change the display count on a system so I haven't experienced that. I do remember reading about something like that getting attention in past months, so perhaps it arrived with Plasma 5.18 or a recent monthly KDE Frameworks update?
                Maybe they got tired of squeaky wheels after 10 years. Thank goodness however.

                The HDMI-based "TV" displays don't behave like a normal DPMS-based monitor in power-down; they entirely detach on the wire, so it's something of an anomaly, based on my choice of displays, that causes the OS to see them removed and windows repositioned each time. I'd like to think I'm not the only one to see this, but the bigger issue is probably not knowing where exactly to report it.

                Originally posted by polarathene View Post

                My laptop with only 2 cores i3 CPU and 4GB RAM, no dGPU only the Intel iGPU can boot with ~500MB of RAM in use, and 0-1% CPU idle, desktop is probably similar, but I know it gets worse over time, especially when kwin fails and compositing takes a dive as a result.
                I don't even know how that would work with only 4gb of ram. On mine, barely idle, kde uses around 3.5gb of ram, usually with kwin taking a lot of it, my laptop having a 4k display too. I launch Chrome with 5 profiles and it takes another 10-14gb instantly. Firefox set up the same wants about 7gb of ram. LibreOffice wants another good 5-10gb or so with my normal workflow. Throw in Zoom (~3gb), Slack (~2gb), a Win10 VM (8gb), and other native software, and it gets interesting.

                It is a bit of a stretch for this laptop with the Intel GPU driving it 100%, but it works surprisingly well - without kwin. I had to bump RAM from 32 to 64GB, but I'm much happier now.

                Originally posted by polarathene View Post

                Each option has issues afaik. Nvidia gets a lot of shit, but despite all the praise AMD gets, there's been plenty of cases where it's been pretty bad too. Sometimes it's specific GPU models/products that you need to do a bit of research into (harder if it's a fairly new product); other times it's just needing a newer kernel/Mesa, or waiting several years. If you don't need much vRAM or GPU grunt, going with older GPUs is apparently good; not too old though (I got an R240 or something, a budget GPU, but it was one generation too old to benefit from something that I wanted/cared about). The RX580 is apparently decent these days, might serve you well?
                I used an older 6970 triple-slot Asus card for years with the OSS drivers to feed the 6x 1080p LCDs that were my prior desktop here. That thing was absolutely solid for desktop use, without compositing. Once it got to the point where everything was compositing and wanted it, things got more unreliable, again mostly due to kwin. I'm good with an older card; I just need to make sure it has enough RAM for my large framebuffer and can support my 3x output of DP 1.2 or HDMI 2.0. I trust the drivers to be better supported as well with the newer kernels I tend to run with Arch.

                Originally posted by polarathene View Post

                Could it be due to hardware limitations? I don't know your particular setup or device capabilities, but I know from stuff like USB that there are a lot of gotchas despite whatever the marketing claims: they cite the protocol specs, but things like chipset and cable quality, the power supplied to that chipset from the device, and the target device(s) you connect to all contribute to what is actually supported and achievable. If the problem is consistent across OSes and it seems like it shouldn't be an issue, it's probably due to hardware.
                Probably, yes - coming to technology as primarily a "network/security guy" for 20 years, I know about hardware limitations. I don't see any obvious hardware limits in my systems at least, and I have dug with about every tool I could find. That's why, for my last build, I built it like a server and uncorked things with massive specs - software still comes unglued.

                Oddly, I see things like compositing work, and then things get sloppy later - a day, a week, months later. That's why I suspect kwin is just a resource leak in motion, though most compositing engines suffer this eventually. Maybe the GPU, but it's hard to quantify. Never really RAM: it was hard for me to blow out even virtual allocation on 128gb of ram, though less so on the 32gb my xps15 had before I upgraded it to 64gb. It does come apart, which I presume is some general instability of software state that no one without 3x 4k displays in use on linux ever notices. It's probably only me using kde at this 11520x2160 resolution normally, with either the nvidia 1070gtx in my desktop or my lowly intel laptop.

                For better or worse, I likely am their high-end benchmark pushing boundaries in platforms, now if only I could get someone to even notice, and even possibly help determine/fix their scaling issues.



                • #48
                  Originally posted by mikus View Post
                  I don't even know how that would work with only 4gb of ram. On mine, barely idle, kde uses around 3.5gb of ram
                  That doesn't sound right... I'm talking about a fresh boot of the system to the desktop. If I include Dolphin/Kate/terminals and the like from session restore, it's more like 600-700MB atm. Open other software like a web browser, restore all its tabs, and that will add many GB on the desktop; on the laptop I have to keep the number of tabs/windows for a browser managed due to low RAM.

                  Regarding the server host setup as a solution: you don't have to change hardware. If things are working on your current hardware, I was just referring to using it as the host system/server and then managing separate VMs for services or DEs as needed. That should resolve, or at least better work around, many of the issues you've cited, but it may be a hassle to get set up.

                  Originally posted by mikus View Post
                  I'd like to think I'm not the only one to see this, but the bigger issue is probably not knowing where exactly to report it.

                  For better or worse, I likely am their high-end benchmark pushing boundaries in platforms, now if only I could get someone to even notice, and even possibly help determine/fix their scaling issues.
                  You should probably report it to KDE's Bugzilla. If you're not sure where, perhaps you can ask the community for some guidance, but generally they expect you to direct bug reports to Bugzilla and not rant about them much on community channels, where they'll get lost over time. The r/kde subreddit is pretty good; just chime in about your issue (be more terse, as most aren't likely to read paragraphs of text) and someone will probably let you know where on Bugzilla it belongs. Alternatively, just take a guess, and a dev will relocate it to the appropriate place if it's in the wrong one.




                  • #49
                    Originally posted by mikus View Post
                    The HDMI-based "TV" displays don't behave like a normal DPMS-based monitor in power-down; they entirely detach on the wire, so it's something of an anomaly, based on my choice of displays, that causes the OS to see them removed and windows repositioned each time. I'd like to think I'm not the only one to see this, but the bigger issue is probably not knowing where exactly to report it.
                    On X11, even with LCDs, I've found the KScreen 2 daemon to have a penchant for confusing Plasma about screen assignments and which panel should go where, so I disable it on my desktop (KRunner → Background Services)... assuming I don't just use nVidia's MetaModes option to lock the desktop resolution, to keep games' first-run resolution changes from confusing KWin into squashing everything onto a single monitor.
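                    For reference, locking the layout with the proprietary Nvidia driver looks something like this in an xorg.conf Screen section; the output names, offsets, and Identifier here are made up, so check yours with `xrandr` or `nvidia-settings` first:

```
Section "Screen"
    Identifier "Screen0"
    # Pin each output to a fixed mode and position so fullscreen apps
    # can't renegotiate the desktop layout behind the compositor's back.
    Option "metamodes" "DP-0: 3840x2160 +0+0, DP-2: 3840x2160 +3840+0, DP-4: 3840x2160 +7680+0"
EndSection
```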



                    • #50
                      Originally posted by polarathene View Post

                      That doesn't sound right... I'm talking about a fresh boot of the system to the desktop. If I include Dolphin/Kate/terminals and the like from session restore, it's more like 600-700MB atm. Open other software like a web browser, restore all its tabs, and that will add many GB on the desktop; on the laptop I have to keep the number of tabs/windows for a browser managed due to low RAM.
                      It is what it is - it's simply the result of how I'm working here. I rebuilt my xps15 clean from Ubuntu to Arch a few years ago to match my desktop, and it has the exact same RAM utilization as my desktop did, and (K)ubuntu was the same prior. I'd love to know what I'm doing wrong, other than using a lot of tabs and LibreOffice windows; I simply need 32-128GB of RAM for base function.

                      Originally posted by polarathene View Post

                      You should probably report it to KDE's Bugzilla. If you're not sure where, perhaps you can ask the community for some guidance, but generally they expect you to direct bug reports to Bugzilla and not rant about them much on community channels, where they'll get lost over time. The r/kde subreddit is pretty good; just chime in about your issue (be more terse, as most aren't likely to read paragraphs of text) and someone will probably let you know where on Bugzilla it belongs. Alternatively, just take a guess, and a dev will relocate it to the appropriate place if it's in the wrong one.


                      https://community.kde.org/Get_Involved/Issue_Reporting
                      The window placement issue is obviously a KDE problem, but the cause is more in the display handling at either the kernel, video driver, or software layer (xorg, xrandr, kde). The problem I run into: if I go to KDE, they tell me to talk to Nvidia. If I talk to Nvidia, they tell me to talk to KDE, or upstream to the kernel. Round and round we go. That's why I had to simply abandon my desktop, as I went through this dance with the GLES driver issue randomly appearing on it - no one (kde, nvidia, or kernel) had any good suggestion, and they just kept referring me elsewhere or telling me to try reinstalling everything. I thought I'd ended that with windoze...

                      Like right now: I got a new Thunderbolt 3 dock for my laptop, as my Dell TB16 unit was failing (bad cable, not replaceable) - insert random chinese replacement. Every time my monitors get shut down by the OS for power save (normal monitor DPMS shutdown), my 3rd monitor comes back in 1920x1200 mode and refuses to go back to 4k until I disconnect the dock and restart it. Is it the dock, is it nvidia drivers, is it kde, xrandr, or is it the kernel?

                      When it does this, often I can go into KDE and set it to 4k again; sometimes I can't, as if the display is only reporting 1080p as its max. If only 1080p max, disconnecting and reconnecting the TB port often works. Sometimes I just end up having to reboot to get my 4k back fully. Trying to chase down whose problem it really is under Linux is a bad game of whack-a-mole.
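                      When an output loses its 4K mode like that, it can sometimes be re-added by hand with cvt(1) and xrandr(1) rather than rebooting. A sketch, assuming X11; the output name DP-3 and the helper name are made up, and the helper just reshapes a cvt Modeline into arguments for `xrandr --newmode`:

```shell
#!/bin/sh
# Strip the leading "Modeline" keyword from cvt(1) output so the rest
# can be fed to `xrandr --newmode`.
modeline_args() {
    printf '%s\n' "$1" | sed 's/^[[:space:]]*Modeline[[:space:]]*//'
}

# Usage on a live X session (commented out; needs a running X server):
#   ml=$(cvt 3840 2160 60 | grep Modeline)
#   xrandr --newmode $(modeline_args "$ml" | tr -d '"')
#   xrandr --addmode DP-3 3840x2160_60.00
#   xrandr --output DP-3 --mode 3840x2160_60.00
```

                      This only papers over whatever the dock/driver is doing wrong, but it at least avoids a full reboot when the EDID comes back wrong.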

                      The good news is they *did* fix the window placement issue; once I get the displays reporting right after a reconnect, it moves things mostly back to the right place. Mostly... It doesn't on the 3rd display that changes resolution, but it's far improved from simply slamming all the windows into some default corner that was always simply wrong.

