X.Org Server 1.20.7 Released With A Handful Of Fixes For GLAMOR + Modesetting

  • #21
    Originally posted by skeevy420 View Post

    I find it hard to believe you went through the systemd transition and the changing of init systems, the /usr merge, the introduction of multiple repositories like core and community, the introduction of stable and testing, the dropping of i686 and unstable, the picking up of 64-bit, and more... all completely unscathed. None of that ever put you in a reinstall position? Not ever? That's very unlikely.

    To be pedantic -- 15 years ago, Arch was a 32-bit only distribution. In April this year, 64-bit Arch will be 14 years old.

    So, with that said, are you really telling me you started on a 32-bit Arch Linux and that same install, 15 years and several PCs later, is still working, and that, worst case, all you've ever needed were a few chroot fixes? I'm usually a trusting person and will take people and posts at face value, but I just cannot believe that not once in 15 years did you reinstall Arch because you had to, and that you only ever did it because you wanted to.

    I don't care what OS you're using; using it for 15 years without a single issue ever coming up that required a reinstall sounds unbelievable.

    Are you combining need-tos and want-tos, or did you just time your need-tos to coincide with your want-tos? I know I combined them for the /usr merge and the introduction of stable and testing. That's reasonable, believable, and what a lot of Arch users do over the long term. I know I'll also call that kind of scenario "an update" even though it technically is a reinstall where I didn't wipe /home midway through... like those situations where you read the Arch News and know your system will break after the next "pacman -Syu", and you can either spend an hour in a chroot fixing it or spend an hour doing a reinstall, because either way gets you the desired outcome...
    "[...] the introduction of stable and testing"? The testing repo is there since I can remember, what's up with that?
    I said, I never needed/had to reinstall (in the classic sense). Everything you mentioned is/was possible w/o reinstalling. I did it few times because I wanted to, and that was the only reason.
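
    For anyone unfamiliar, a "chroot fix" like the ones mentioned above is roughly the following, done from the Arch install ISO. This is a minimal sketch; the device names are assumptions for illustration:

        # Boot the Arch ISO, then mount the broken system
        # (assuming root on /dev/sda2 and the ESP on /dev/sda1):
        mount /dev/sda2 /mnt
        mount /dev/sda1 /mnt/boot
        arch-chroot /mnt     # enter the installed system
        pacman -Syu          # finish or repair the interrupted upgrade
        mkinitcpio -P        # rebuild the initramfs if the kernel was involved
        exit
        reboot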



    • #22
      Originally posted by skeevy420 View Post

      I find it hard to believe you went through the systemd transition and the changing of init systems, the /usr merge, the introduction of multiple repositories like core and community, the introduction of stable and testing, the dropping of i686 and unstable, the picking up of 64-bit, and more... all completely unscathed. None of that ever put you in a reinstall position? Not ever? That's very unlikely.
      Dude, you're really out of it. I've used Arch since 2003, and I, too, have never had to reinstall it. Yes, there was a transition from i686 to amd64, but for me that came when I finally bought a 64-bit machine and installed the amd64 version. So between 2003 and 2020 I changed computers four times but installed Arch only twice (in 2008 and in 2015), because the other two times I carried the installation over.

      So your funny list of events that I went through without reinstalling:
      [x] transition to systemd … made the change when I felt like it; there was a sysv compat package too, which I later just dropped
      [x] /usr merge … really not much to see there
      [x] introduction of multiple repositories … well, I think there was always core and extra; adding [community] wasn't too hard
      [x] introduction of stable and testing … never had to care about testing: it's there so things get tested before I get the package in stable, so…
      [x] dropping of i686 … used some lib32-* packages for quite a while on amd64, e.g. for Skype and Wine and such
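
      (For context, on current Arch the lib32-* packages come from the [multilib] repository, which is enabled by uncommenting a stanza like this in /etc/pacman.conf:)

          [multilib]
          Include = /etc/pacman.d/mirrorlist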

      Things you forgot:
      [x] transition to signed packages
      [x] transition from devfs to udev (this was really tough!)
      [x] the evolution of udev
      [x] transition to sysfs
      [x] transition to dkms
      [x] transition to mkinitcpio
      [x] all the various AUR helpers and changes to makepkg and PKGBUILD, e.g. hooks and licenses

      … and countless other innovations that completely changed the GNU/Linux desktop in the last 18 years. Yes, I went through all of that with just two fresh installs of Arch, both on new hardware.

      Now if you had to reinstall Arch Linux all the time, I guess, well, PEBKAC.

      Here are some images from December 2005, for nostalgia:

      Last edited by ypnos; 14 January 2020, 07:02 PM. Reason: some images



      • #23
        Originally posted by betam4x View Post
        We need to move away from this 'LTS' line of thinking. LTS means little in the context of support of a free operating system, and it definitely doesn't suggest any sort of stability.

        One other thing: If you think NVIDIA is the least bit pressured by Ubuntu, you are wrong.
        I didn't say LTS equated to stability (despite the S in the LTS part).

        Ubuntu is a fixed-point-release distro to me, but in terms of stability it's slower-moving. There are still updates in between the fixed releases (April/October), but, as has been said, plenty of software is held back from receiving updates between those fixed releases, so bugs are present. The benefit, however, is that those bugs are more likely to be known and "reliable", in the sense that they persist until a bigger update arrives to potentially address them; before then, you shouldn't encounter any new bugs, or at least you're far less likely to than on a rolling distro. So stability, to me, is more that it's predictable/reliable, not so much that it's less likely to crash or have other issues.

        You'll also find Ubuntu has a relatively large community and is generally a priority target for software support (e.g. numerous Steam games (and Steam itself shipping Ubuntu packages/libs), VMware, CI services, VPS providers, etc.), and even more so with the LTS releases every two years. I don't use Ubuntu personally, and have not for many years, but I don't play down its role and importance within our Linux community.

        So, fixed point releases vs rolling releases: stability really depends on your definition. I could run Manjaro and call it stable too if I never updated it (I've encountered breakage numerous times, with Arch as well), but there's more benefit to an LTS release of a distro than an arbitrary choice to halt updates (at least the fixed point releases do get some updates, like security fixes, backported kernel updates, and support for the versions of software they ship, plus some ability to get newer kernels or software if needed). I use a rolling release on my personal system, but I would be reluctant not to have at least one fixed-release distro available at work in case something went wrong with the rolling release (granted, with certain setups, like openSUSE's default Btrfs layout, you can roll back easily when stuff breaks on Tumbleweed, so it's not as bad/risky).
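
        As a rough sketch of that rollback path on openSUSE's default Btrfs/snapper setup (the snapshot number below is a placeholder; in practice you often boot the pre-update snapshot from the GRUB menu first):

            snapper list               # find the snapshot taken before the bad update
            sudo snapper rollback 42   # 42 is a hypothetical snapshot number
            reboot                     # come back up on the restored snapshot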

        Regarding pressure on Nvidia: if it got to the point where Ubuntu LTS releases were defaulting to Wayland and pushing it hard over X11, chances are other distros would already be doing the same by then, especially rolling-release ones. If we reached that transition point, Nvidia would have to step up its Wayland support, since what follows is the distros targeted at professional workstation use (if they hadn't already adopted Wayland by default by that point). That is generally where Nvidia gives a shit about the Linux desktop, because that's where the paying users (companies, e.g. Red Hat/IBM and SUSE on the distro side, and companies like those in the VFX industry) and support contracts are, beyond GPU compute workloads.



        • #24
          Originally posted by Britoid View Post
          Noob helpers are stupid too, they shouldn't exist. If a noob helper is needed something has gone wrong, and often the noob helpers create more issues than they're worth.
          Ehh... sometimes they're alright (not referring to Manjaro in general), and to a degree it depends on what you actually consider a noob helper. I do find Manjaro's hardware-manager thing a bit annoying. Once I tried to use Manjaro in a KVM UEFI VM, and it refused to boot because the hardware helper was "detecting" a VGA/BIOS system instead of UEFI for some reason, and thus tried to initialize drivers via the "int10" interrupt, which doesn't work on UEFI. If X11 was instead allowed to auto-detect on its own, it did so correctly and you'd get actual display output. I even told the developers about it, and they didn't seem to care or bother to fix it.

          If anyone does arrive here from Google: to resolve it, you can use a kernel boot parameter for systemd to disable the mhwd service as a workaround, as sketched below.
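
          A minimal sketch of that workaround, using systemd's systemd.mask= kernel command-line option. The unit name here is an assumption; check systemctl list-unit-files on the live system for the real one:

              # Append to the kernel command line, e.g. by pressing 'e' on the boot menu entry:
              systemd.mask=mhwd-live.service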



          • #25
            Originally posted by skeevy420 View Post

            For workstations working on long-term projects, and for various servers, LTS does make sense, because a feature update could completely screw up a company's or a person's workflow, or you could be using Btrfs and the updated kernel isn't quite happy with your filesystem configuration, so goodbye data. That's because stability, in a lot of LTSes, means the platform won't have any drastic changes and is therefore "stable"; not stability as in bugs being less likely to occur.

            That's why, IMHO and for the most part, LTS is long-term stale and not long-term stability. All one has to do is look at random Steam bug reports to find all the examples they'd need that LTS isn't good for the average desktop user or new Linux user. The number of LTS-distribution users reporting bugs, with non-LTS users replying that XYZ fixed that six months or a year ago, is honestly hilarious.

            ...unless we're talking about projects like KDE Plasma or the Linux kernel that have both LTS and mainline releases. In cases like that we're actually getting a long-term supported project, and not SUSE's or Ubuntu's or *insert distribution here*'s rendition of an LTS project. I've seen both Ubuntu and SUSE feature-freeze at utterly ridiculous times, like not waiting a few more days to use a Linux LTS release for their LTS distribution... nope, we're gonna stick with the kernel release right before the LTS release and backport a bunch of shit, because that makes a lot of fucking sense.

            I don't use Nvidia. Fuck Nvidia.

            AMD FTW
            Your first point is incorrect. I've simply run apt update on an 'LTS' release only to have my system no longer boot because an 'LTS' kernel was updated to a buggy version. I've also had similar things happen on Debian. That argument simply does not hold water. Furthermore, I do more work in Linux than most. I build websites from the ground up (including graphic design, UX design, etc.); build, edit, and release YouTube videos and private training videos; and I've also hosted and maintained servers on Amazon AWS, Digital Ocean, Google Cloud, Microsoft Azure, and Linode, as well as dedicated servers, for extended periods of time. My takeaway is that it doesn't matter whether you use an 'LTS' release or not. Each has an equal chance of breaking (and the overall chances of anything breaking are low as long as your hardware is compatible and you aren't doing weird stuff to your system; I've only had Arch break once, for example, and that was due to me doing weird stuff! *surprise*). The only exception to this is Red Hat Enterprise Linux, where you can get paid support for when things DO break, but if you are in that boat, this comment does not apply.

            I've also never had a piece of software stop working due to an update. Whenever I had an issue, I knew how to downgrade the package or provide a local copy of the files.
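
            On pacman-based systems, for instance, a downgrade usually just means reinstalling the previous version from the local package cache; a sketch, with a hypothetical package name and version:

                # Find the previously installed version in the cache, then reinstall it:
                ls /var/cache/pacman/pkg/ | grep foo
                sudo pacman -U /var/cache/pacman/pkg/foo-1.2.3-1-x86_64.pkg.tar.zst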

            Regarding your last comment, 'fuck NVIDIA': I don't take sides. NVIDIA right now is the only one with a competitive offering in the graphics market. That will change later this year, and when it does, I'll likely upgrade back to an AMD card (as I have in the past); however, if their offerings aren't competitive, I'll vote with my wallet. NVIDIA drivers work just fine on Linux. The reason Wayland does not work properly is the fault of Wayland. Wayland itself has many issues, which I and others have attempted to bring up many times. Wayland is not gaming-friendly, foists too much work onto other developers, and is immature (and was VERY incomplete at its initial release). The folks who designed and developed Wayland did not think about all use cases. They didn't even think about users who wanted to use the clipboard to copy/paste! (Wayland now has clipboard support thanks to various clipboard providers, but that was a problem early on.) So no, I don't blame NVIDIA for anything. I do not see open source as the end-all be-all solution to everything. It is great that there is an open source community and I hope it continues, but folks need to realize that not everything can be open-sourced, for a number of reasons (which I can happily go into in detail if you'd like).
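
            As one concrete example on that clipboard point (not necessarily what was meant by "clipboard providers"), the wl-clipboard utilities expose the Wayland clipboard from the command line:

                wl-copy "some text"   # put text on the Wayland clipboard
                wl-paste              # print the current clipboard contents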



            • #26
              Originally posted by betam4x View Post

              Your first point is incorrect. I've simply run apt update on an 'LTS' release only to have my system no longer boot because an 'LTS' kernel was updated to a buggy version. I've also had similar things happen on Debian. That argument simply does not hold water. Furthermore, I do more work in Linux than most. I build websites from the ground up (including graphic design, UX design, etc.); build, edit, and release YouTube videos and private training videos; and I've also hosted and maintained servers on Amazon AWS, Digital Ocean, Google Cloud, Microsoft Azure, and Linode, as well as dedicated servers, for extended periods of time. My takeaway is that it doesn't matter whether you use an 'LTS' release or not. Each has an equal chance of breaking (and the overall chances of anything breaking are low as long as your hardware is compatible and you aren't doing weird stuff to your system; I've only had Arch break once, for example, and that was due to me doing weird stuff! *surprise*). The only exception to this is Red Hat Enterprise Linux, where you can get paid support for when things DO break, but if you are in that boat, this comment does not apply.

              I've also never had a piece of software stop working due to an update. Whenever I had an issue, I knew how to downgrade the package or provide a local copy of the files.

              Regarding your last comment, 'fuck NVIDIA': I don't take sides. NVIDIA right now is the only one with a competitive offering in the graphics market. That will change later this year, and when it does, I'll likely upgrade back to an AMD card (as I have in the past); however, if their offerings aren't competitive, I'll vote with my wallet. NVIDIA drivers work just fine on Linux. The reason Wayland does not work properly is the fault of Wayland. Wayland itself has many issues, which I and others have attempted to bring up many times. Wayland is not gaming-friendly, foists too much work onto other developers, and is immature (and was VERY incomplete at its initial release). The folks who designed and developed Wayland did not think about all use cases. They didn't even think about users who wanted to use the clipboard to copy/paste! (Wayland now has clipboard support thanks to various clipboard providers, but that was a problem early on.) So no, I don't blame NVIDIA for anything. I do not see open source as the end-all be-all solution to everything. It is great that there is an open source community and I hope it continues, but folks need to realize that not everything can be open-sourced, for a number of reasons (which I can happily go into in detail if you'd like).
              In all fairness to my first point, I did add this at the end about LTS kernels: "nope, we're gonna stick with the kernel release right before the LTS release and backport a bunch of shit because that makes a lot of fucking sense." LTS is usually fine if they feature-freeze at the right time, but freezing on 4.13 when 4.14 will be out in a week is a very asinine decision to make.

              The Ubuntu committee meeting with their lead kernel dev:
              Well, gentlemen, they're gonna be dropping 4.13 support and moving to 4.14LTS right about when we'll release our next LTS beta RC1.

              Brilliant, 4.13 sounds like a perfect kernel to use for four years.

              Oh no, I'm suggesting that we use the 4.14 kernel with extended support.

              No. Have your team backport 4.14 into 4.13 so people know how much ass we kick.

              Kernel dev to self: Well fuck me running.
              I've had a few things stop working due to updates, like that freakin' ISO tool, UNetbootin, when it removed Windows ISO support... but that's more a case of dropping a feature than of something just not working anymore (although it was damn annoying). I've also had a few glibc issues over the years after updates, but Arch/Manjaro is better about rebuilding all their stuff these days. But, yeah, outside of the occasional rolling hiccup or feature drop, I can't really say that I've had too many issues like that either.

              My issues are usually crap like: terrific, kernel X.Y doesn't like Btrfs compression and now my root is FUBAR; dammit, I did not mean to "zpool upgrade boot-pool"; systemd 240 (what a horrible few days that was). You know, mostly crap you bring on yourself by using bleeding-edge and/or non-standard features.

              While I am an AMD user on Linux, on any other OS I don't really care. Even though I feel the need to buy AMD because they're more Linux-friendly, as someone who buys assloads of proprietary software (you know, games), it would be disingenuous to hate on Nvidia just for being proprietary. We can blame Nvidia for doing their own thing when everyone else is following the standards, like with EGLStreams. That said, this is a Linux site, so I do post with my Linux bias.



              • #27
                Originally posted by skeevy420 View Post
                I find it hard to believe you went through the systemd transition and the changing of init systems, the /usr merge, the introduction of multiple repositories like core and community, the introduction of stable and testing, the dropping of i686 and unstable, the picking up of 64-bit, and more... all completely unscathed. None of that ever put you in a reinstall position? Not ever? That's very unlikely.
                That can easily be true. I have also never reinstalled my Slackware in 17 years; I've always just updated it in place with new packages, replaced 32-bit packages with amd64 ones, copied it to new HDDs and new machines, etc. I assume it is much easier on Slackware than on Arch, as there is no systemd, so any new system features are just lines in the rc files of the sysvinit startup scripts, and the new additions are easily merged with your existing scripts.



                • #28
                  Originally posted by aceman View Post

                  That can easily be true. I have also never reinstalled my Slackware in 17 years; I've always just updated it in place with new packages, replaced 32-bit packages with amd64 ones, copied it to new HDDs and new machines, etc. I assume it is much easier on Slackware than on Arch, as there is no systemd, so any new system features are just lines in the rc files of the sysvinit startup scripts, and the new additions are easily merged with your existing scripts.
                  Not saying that it isn't possible, but I do find it highly unlikely for the vast majority of people. Especially so on a distribution like Arch, where plenty of system-breaking things have slipped by in the past.

                  Then again, because of those issues slipping by, I keep my stuff set up in a manner where reinstalls are just easier than fixing things; since I'm always trying out different stuff and end up with assloads of junk packages, I'm biased toward starting over rather than repairing. I also think my reinstall bias has a lot to do with growing up with Windows 98.

                  I'm still very skeptical of y'all, bias or not.

