9 months ago I put my time where my mouth was and installed Arch Linux, and only Arch Linux, on my daily machine. Two days ago, I reinstalled Windows on another hard drive (please hear my story out before banning me for heresy). I gave Linux and Microsoft the same chance: I took the most modern version of each (or the closest I could get), with every option that seemed to make sense for my usage, and gave both a long test drive. Admittedly, 2 days of Win 11 is not enough to judge its quality, but since it is mostly a more locked-down and visually evolved Win 10, which I used for years, I feel like I know what to expect. Also, this isn't going to be about the state of Windows.
Going into those 9 months of using Linux exclusively, I had three goals:
- Run Linux alone: no VMs, no dual boot, no Microsoft at all. Even when my job required MS Teams, I used it through the browser and never installed it. Leave MS behind for good and all.
- See what the Linux experience was like on the (IMO) most Linux of Linuxes, Arch, and run it as I would Windows. I am a gamer and an occasional streamer, and I run a ton of multimedia stuff: music, videos, online and offline. Everything Windows offered, I needed Linux to offer too.
- Not only run everything on Linux, but honestly compare what was better and what was worse than on Windows.
So, 9 months later, here is my verdict, starting with the three things Linux does better.

One, I sincerely think the desktop experience is better. Whether with KDE Plasma or Xfce, Linux offers so many customisation options, so much openness in shaping the desktop: keybinds, visuals, placements, even smaller details like handling your notifications the way you want, or how fast the taskbar disappears when you have a fullscreen app. That level of customisation simply makes the experience far superior. I left my KDE desktop barely 3 days ago, and the Win 11 desktop, for all its simplicity of use, is never going to equal it. Everything on Windows feels competent but slightly inadequate: not the way I want it, not good enough. KDE Plasma and Xfce gave me a pretty much pristine experience, although Xfce really didn't like it when I switched to 4K (the taskbar and most other elements disappeared off screen, which pretty much forced me to stick with KDE).
Two, Linux does give you a rather full-fledged experience. It took me 6 months before I decided the experience wasn't worth continuing. If Linux had been half as incapable as most reviewers make it out to be, I'd have dropped it within a month. Linux satisfies most requirements quite well, and doesn't falter badly enough in any way that was meaningful to me. When it does fail (games that won't start), there are usually enough helpful community members online to at least point you in the right direction.
A good example: I was an Overwatch player, and the game was recently axed by the infernal monstrosity known as Activision Blizzard and replaced with a pathetic excuse for a sequel called Overwatch 2 (which has broadly ruined the Overwatch experience and feel). On day 1 of OW2's release, I wanted to try it. And it wouldn't work. Lutris started Battle.net.exe just fine, but Battle.net launching OW2 would instantly fail. I went on Reddit to ask for help, and a nice person had already written up which program to use to make OW2 compatible. It worked immediately as a drop-in replacement for part of Lutris, and the game has worked mostly well since. It would often fail on launch, but after 2-3 tries, it started.
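For the curious, the general recipe for that kind of "drop-in replacement" is swapping the Wine build Lutris runs the game with. A rough sketch of what that looks like (wine-ge-custom here is an illustration of a popular community build, not necessarily the exact one I used):

```
# Sketch: installing a community Wine build for Lutris to use.
# wine-ge-custom is an illustration; I don't recall the exact build I used.
mkdir -p ~/.local/share/lutris/runners/wine
tar -xf wine-ge-custom.tar.xz -C ~/.local/share/lutris/runners/wine/
# Then in Lutris: right-click the game -> Configure -> Runner options
# -> Wine version -> select the newly extracted build.
```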
Is that massively inferior to the Windows experience? Yes: on Windows there are no tweaks to make, no shaders to compile (which on OW1 happened once per update, so roughly every 2 weeks, and on OW2 somehow became every single startup), you don't have to ask the program to start 3 times, and you never need to call killall on it. But was it bearable? Yes, acceptably so, to me.
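If you're wondering what "asking it to start 3 times" looks like in practice, it's the kind of thing you eventually wrap in a dumb shell loop. A crude sketch (the Lutris game id and the Overwatch.exe process name are placeholders from my setup; adjust for yours):

```
#!/usr/bin/env bash
# Crude retry wrapper, as a sketch. Assumes the game's Lutris id is 1 and
# that a successful launch leaves an Overwatch.exe process running; both
# are placeholders, adjust for your setup.
for attempt in 1 2 3; do
    lutris lutris:rungameid/1 &     # ask Lutris to launch the game
    sleep 60                        # give it a minute to reach the menu
    if pgrep -f Overwatch.exe > /dev/null; then
        exit 0                      # it's running, we're done
    fi
    killall -q wineserver           # tear down the stuck Wine session first
done
echo "gave up after 3 attempts" >&2
exit 1
```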
Linux gets a pass for running most things well enough that I can't call the experience unacceptable, and that's a big plus for it as a competitor.
Three, the sense of ownership of your machine that Linux gives you is indisputable. Unlike MS's rabid forced updates, yay, paru, and pacman felt like actual tools designed to help me. I installed things when I wanted them, how I wanted them, and nothing was ever forced on me. The sense of privacy that accompanies that is also very strong. Linux gives you the feeling, from start to finish, that you are not a client of the Internet or a member of some greater botnet, but a completely independent machine able to tap into the Net rather than be one of its components. It's going from renter to owner. That sense of ownership and privacy is completely absent from any platform by any big company: Apple, MS, or Google.
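To make "nothing was ever forced on me" concrete: on Arch, nothing about the system changes unless you type the command yourself. A typical maintenance session looks something like this (package names are just illustrative):

```
sudo pacman -Syu            # full system upgrade, and only when I run it
pacman -Qi firefox          # inspect what a package is and what it pulls in
sudo pacman -Rns some-pkg   # remove a package and its now-unneeded deps
yay -S spotify              # build and install from the AUR via a helper
```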
Those three positives genuinely go to Linux's credit, and I believe they will never be matched by any of the bigger companies. They are undeniably Linux's strengths.
They came, however, with three negatives.
First, Linux covers arguably 95% of the terrain that Windows covers. But that last 5% hurts it much more than one might think. Most of us here (all of us?) are devs, so let me remind you of a painful memory: that one time you spent 3 days debugging a thing that, for the life of you, wouldn't work right. You went home tired after day 1, disgusted after day 2, and absolutely depressed after day 3. Whether it finally worked at the end of day 3 or the start of day 4, the effort left you drained, annoyed, mentally and physically unwilling to do anything but turn into a couch potato, preferably with alcohol.
But what about that other time, when after 3 days of debugging you realised that the thing you wanted to do was not possible at all? That absolute weariness when you've put in hours of mental strain thinking the problem will eventually crack, and then get rewarded with absolutely nothing for your efforts. One of those moments when you regret ever becoming a developer.
Two examples of this: my new monitors, and Civ VI.
One of my monitors finally died after years of good service, and I was itching to try that 4K everyone talks about. I bought a Gigabyte M32U, a solid beast: 4K at up to 144Hz, nice colours, surprisingly good sound, all-around strong. I installed it, ran it, and was mesmerised by the crispness of 4K text. Reading code hurt my eyes 10 times less; I stopped thinking I needed to change my glasses (I actually do need to change my glasses...). It was one of those "there is no returning from this" moments. I even bought a second 4K monitor, because looking at my other old 1080p monitor made me cry (literally: my eyes kept refocusing, and it was painful).
When that monitor was installed, I wanted to taste that fresh 144Hz experience. I went into Plasma's display settings and tried to set it to 144Hz. Not possible. The best option was 120Hz. I then did as a Linux user does and scoured the internets for the mythical config file or flag that would release me from the merely semi-great experience into the Greatest Experience. After some 20 minutes, I found the answer: my monitor was plugged in over HDMI. My RDNA2 card handles HDMI 2.1. My monitor obviously handles it too. But the Linux kernel doesn't (as far as I can tell, in no small part because the HDMI Forum's licensing stands in the way of an open-source HDMI 2.1 implementation). Which I understand, since surely less than 1% of Linux users buy and daily-drive 4K 144Hz monitors. But when I compared it with how long ago the consumer-oriented Windows had HDMI 2.1 covered, it stung. Sure, it's only a 24Hz difference, and losing 20 minutes is really no biggie. But to pay 800€ for a monitor and get a hampered version of it because Linux just isn't that invested in HDMI 2.1 support, well, it did sting.
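If you want to check this on your own machine, the kernel exposes the list of modes each connector actually offers, which is how you can confirm 144Hz simply isn't on the menu over HDMI (the connector name, card0-HDMI-A-1 here, varies per setup):

```
# Modes the kernel driver actually exposes for a given connector
cat /sys/class/drm/card0-HDMI-A-1/modes

# On X11, xrandr lists the available modes with their refresh rates per output
xrandr --query
```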