What Are The Biggest Problems With Linux?


  • Originally posted by gaunilo View Post
    Linux scares people. They feel helpless.

    What do I do if I get no internet connection / my printer does not print / I don't hear any sound? Uninstalling and reinstalling usually does not resolve issues the way it does on other OSes. If something goes wrong, you sometimes have to read man pages and edit config files, maybe even in shell mode. Being in such a situation must be a horror for somebody who only wants to surf the net and read email. Standard components like sound, internet, printing, and graphics should behave much better, with much more testing. Danger spots should be cushioned by safe modes.

    Why not have senior citizens install Linux and see how they fare in comparison to other OSes? I bet much could be learned from that. I believe Linux still has to become a little less rough around the edges to attract a wider audience.
    Re-installing is an artifact of some other OS, because that OS is, well, a pile of crap that just randomly fails.

    Linux generally does as it is told. If your internet connection fails, re-installing won't help if you don't have a driver for your wifi. This issue exists on any OS; I've seen it happen often enough on Windows too. Installing XP (which may be a company requirement, for example) without slipstreamed SATA/wifi support: who fixes that? Your sysadmin makes a new installer with slipstreamed drivers. In the case of Linux, your best bet would be to use a wired connection and see if your wifi is supported (a quick check is sketched below), just as an example. It sucks, yes, but you can't blame Linux for hardware vendors not being willing to support their own hardware properly.
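
    To make 'see if your wifi is supported' concrete, here is a minimal sketch of my own (nothing official, it just assumes a Linux sysfs layout) that prints which kernel driver, if any, is bound to each network interface. If nothing is bound to your wlan device, the kernel simply has no driver for that chip, and re-installing won't conjure one up.

        import os

        # Walk sysfs and show which kernel driver is bound to each
        # network interface (assumes a Linux /sys layout).
        SYS_NET = "/sys/class/net"

        for iface in sorted(os.listdir(SYS_NET)):
            driver_link = os.path.join(SYS_NET, iface, "device", "driver")
            if os.path.islink(driver_link):
                print(iface, "->", os.path.basename(os.readlink(driver_link)))
            else:
                print(iface, "-> no driver (or a virtual interface)")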

    So forget about 'reinstalling fixes things'. It generally doesn't, and it's actually really strange to assume it does.

    You would actually be surprised how well senior citizens can work with Linux to surf the web, read their email, etc.



    • Originally posted by e8hffff View Post
      Have to agree.

      One example: I install a LaserJet printer driver, and as soon as there is an update patch to CUPS or some other piece of printer infrastructure, my printer usually stops working. How on earth can a business depend on that? BTW, I'm not running a business.

      There needs to be pressure to keep to standards and backward compatibility, or else to branch into new projects.

      Canonical has been a major mover in standardising things and putting systems in place that make upgrades and changes easy. More projects need to adopt those systems or make their own. For example, the PPA system is excellent; Launchpad.net is another good move.
      A business won't have to depend on that; they will properly test upgrades. If they install a new version of CUPS on their test server before it goes into production and notice that the printer doesn't work anymore, they should be competent enough to fix it - that's why they get paid - and have a support channel with their Linux vendor. If no support channel is available, then they will need to do a little research into the issue. That is not a problem for them: it is their job, and they should be competent enough to do it.

      I'm not talking about a major bug that needs severe code rewriting, just going with your example of a broken printer PPD (driver). Also, as far as I know, older PPDs can be used in newer CUPS versions just fine, and vice versa; and most of the time you would not be required to upgrade your CUPS to the latest version immediately anyway.
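
      To illustrate the kind of pre-production check I mean, here is a rough sketch using the pycups bindings (assuming pycups is installed; the check itself is just an example). It verifies that every configured queue still exists and isn't stopped after an upgrade:

          import cups  # pycups bindings for the CUPS API

          # CUPS printer-state values: 3 = idle, 4 = processing, 5 = stopped
          STOPPED = 5

          conn = cups.Connection()
          printers = conn.getPrinters()

          if not printers:
              print("no queues configured at all - that alone fails the test")

          for name, attrs in printers.items():
              if attrs.get("printer-state") == STOPPED:
                  print(name, "STOPPED:", attrs.get("printer-state-message", ""))
              else:
                  print(name, "ok")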

      There definitely needs to be pressure to keep to standards. But I think Linux is your best bet when it comes to standards and keeping to them. I'm sure we all haven't forgotten the W3C, IE, and HTML, to name just one example.



      • Originally posted by Dandel View Post
        There are a number of big issues, but not all of them are known.

        If you're looking at the kernel only... the list would be:
        1. The absolute biggest: there is no stable driver API between kernels. The driver API constantly breaks between releases, and this drives away device makers that wish to support Linux. Most device makers want to write a driver once and then expect it to work for the whole major version. A good example would be writing a driver for Linux kernel 2.6.x whose binary keeps working up until at least version 3.0.0. As it stands now, though, drivers constantly hit some form of API breakage between kernel releases.
        2. ACPI/power management issues eating into battery life on consumer laptops.
        3. Microsoft and UEFI Secure Boot. Vendors are likely to support only Microsoft and lock out everything else.
        As mentioned above, a stable API? Nope. Device makers don't want to support their stuff. If their driver is in mainline, support isn't that big of an issue: small API/ABI breakage is handled by the kernel devs all the time, and the ones who break things tend to fix things too. Common decency, of course, is to keep your own stuff in order as much as possible.
        BTW, 2.6 and 3.0 are really nothing that different; they just changed the way they counted (2.6.39 was simply followed by 3.0). But I know what you mean.

        Vendors aren't likely to support MS just because MS asked them; vendors will have to support this or they won't get a 'runs Windows 8' sticker. MS is pretty much forcing them. Linux will try its best to run under Secure Boot, but it's MS that will say what can and cannot run on Secure Boot.
        Originally posted by Dandel View Post
        <snip>
        • Switching between graphics cards with ease, without modifying sessions. For example, I would want to be able to switch from the integrated graphics card to the dedicated graphics card when I play games, but when I am done, switch back to the integrated one to save power.
        • Incomplete device support. No support for OpenGL 4.x in the open source drivers.
        Switching between graphics seamlessly is a feature that's missing, and I can see its merits. But it should only be a temporary thing for the next 5 years, I guess. By then, chips (GPUs and CPUs) should be scalable enough to run both power-efficiently and powerfully. Right now, dual graphics is a 'hack' for hardware not scaling properly. I think that 5 years from now we'll have 16-core rigs where each core can individually be turned off (I think they can only be down-clocked right now), with the main core(s) running extremely power-friendly at low clocks. Then even the hacks we see in the latest ARM chips, four powerful cores plus one 'light' core, will go away. Hopefully.
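
        In the meantime, the kernel's vga_switcheroo at least gives a crude manual switch on supported hybrid-graphics laptops. A small sketch (assuming debugfs is mounted and you run it as root) that just reports which GPU is currently driving the display:

            # Report hybrid-graphics state via vga_switcheroo's debugfs file
            # (root only; present only on supported dual-GPU machines).
            SWITCH = "/sys/kernel/debug/vgaswitcheroo/switch"

            try:
                with open(SWITCH) as f:
                    for line in f:
                        # Lines look like "0:IGD:+:Pwr:0000:00:02.0";
                        # the "+" marks the GPU driving the display.
                        print(line.rstrip())
            except OSError:
                print("vga_switcheroo not available here")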

        For OpenGL 4, you answered it yourself below: patents. Politics is the biggest hurdle at the moment, not just technical issues.
        Originally posted by Dandel View Post
        <snip>
        • Software patents that prevent implementation of specifications. For example, Mesa had a lot of issues with the S3TC patents.
        Yes, software patents, which only hold in the US (and Korea) to begin with. Screw useless patents. I doubt the US will ever see the light and abolish them, since there's a lot of money involved. GPL code should be exempt from software patents, period. It's just stupid and we all know it. Be that as it may, we still have to comply, right? Hence, no OpenGL 4.
        Originally posted by Dandel View Post
        • Lack of commercial video games (like Unreal Tournament 3, Halo, etc.) being released natively on Linux. Instead these games are increasingly being released on Mac OS X using libraries that are cross-platform. Generally speaking, this is a major issue because a lot of these games use libSDL, OpenAL, OpenGL, and a few other libraries that also exist on Linux.
        • Digital Rights Management demands by major content creators/providers. This on its own is inherently incompatible with the GPL, but is mandatory if you want anything to do with consumer media (video games, eBooks, Blu-rays, DVDs, etc.). This is also why there are no decent HDMI capture cards that work on Linux that users would want to use. A good example would be wanting to capture a high-definition stream of a video game you are playing on a PlayStation 3, Xbox 360, or other major console.
        Gaming may improve with Valve supporting us more! As for DRM, it goes in the same boat as the patent stuff above.
        Originally posted by Dandel View Post
        • Even though this is not Linux-specific, it is key... GPL compliance is abysmal, and companies within China are usually the main instigators. This issue will only grow, and any action to correct it will only hinder Linux overall, because it'll scare companies away. While I mention China, there are also American companies that develop and release successful products using Linux (for example, Android tablets and routers) and then fail to comply with the basic license agreements involved. In any case, if you compare Linux and most other open source products to the closed source alternatives, you wind up with a comparison where the open source product is a Reliant Robin and the closed source product is a Lamborghini Diablo.
        I'm not sure I quite understand your point here, but I know GPL compliance is really bad. Vendors try to steal stuff and get away with it. And for lack of funds etc., the license is not being properly enforced (in some cases it's also the devs not bothering).
        Last edited by oliver; 06-11-2012, 04:51 AM.



        • And now I'll stop responding to all the FUD; reading this thread filled up an entire page! So this is the last one.

          Originally posted by DeiF View Post
          For me it has to be interactivity:


          Run a process that needs more RAM than is available, and as soon as the process starts using swap, everything becomes unresponsive (not just that process). If you have patience you can kill the process by SSHing into the box, but it usually takes something like half an hour to do so.

          Have a CIFS mount, and as soon as the mounted device becomes slow or stops sending packets, your system becomes unresponsive too (the GUI, any terminal window), even if you aren't accessing any files from it. You can't even unmount the device.

          Or just click or right-click anywhere. Open windows. Click on menus, etc. Everything reacts slowly. You can verify this by filming the screen with a video camera and then measuring the delays by analyzing the video.
          You can even see the repaint of every window, and how the painting isn't uniform (i.e. frames, parts of the content, etc. all painting at different times).

          Or compare the time to launch an app like KCalc with the time to launch the calculator on Windows XP.
          Or compare the time to launch a native app on Linux vs. the same app ported to Windows but launched on Linux through Wine.


          Then go to benchmarking sites like this one and cry, because interactivity isn't benchmarked and nobody seems to care.
          Also: benchmarking clusters of ARM computers, but no benchmarks of 6-year-old PCs. WTF.
          You raise genuinely good points, ones I have encountered myself too! As for your clicking and KCalc comparison, I blame Qt for being a behemoth; the calculator under Ubuntu (GTK-based, that is) starts fast. I will agree, though, that Firefox, for example, has been benchmarked to start faster under Wine. Maybe preloading is missing in Ubuntu?

          As for your benchmarking point, that's more a complaint against Phoronix, isn't it? But yes, I do miss some form of comparison with my IBM T42 (with some minor upgrades), which is 7 years old and runs like an ace even though it's extremely old! That T60 I believe Michael has isn't THAT old yet.
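
          One more thing on your swap point: a workaround I sometimes use is to cap a suspect process's address space up front, so a runaway allocation fails fast instead of dragging the whole desktop into swap. A minimal sketch with Python's resource module (the 2 GiB cap and the program name are just placeholders):

              import resource
              import subprocess

              CAP = 2 * 1024 ** 3  # example cap: 2 GiB of address space

              def cap_memory():
                  # Runs in the child before exec; an oversized allocation
                  # then fails with ENOMEM instead of swap-thrashing the box.
                  resource.setrlimit(resource.RLIMIT_AS, (CAP, CAP))

              subprocess.run(["./memory-hungry-program"], preexec_fn=cap_memory)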



          • Originally posted by oliver View Post
            There definitely needs to be pressure to keep to standards. But I think Linux is your best bet when it comes to standards and keeping to them. I'm sure we all haven't forgotten the W3C, IE, and HTML, to name just one example.
            Thanks, oliver. You won't get me returning to Windows. Never. I'm totally finished with that crap.



            • Originally posted by birdie View Post
              This: Why Linux is not (yet) Ready for the Desktop (a.k.a. Linux problems), 2012 edition

              There's one problem no amount of money can solve: for most open source developers, Linux is a playground, a thing they don't care about beyond their own aspirations; thus we have constantly broken features and API breakage every odd moon cycle. With such an attitude there's no way Linux will ever attract a big number of serious ISVs. Of course, people will be quick to point out that the open source software already available is enough for everyone - but that's serious myopia. No, it's not enough, very very far from that.
              This is not true when it comes to Ubuntu and RHEL. API breakage is a bullshit complaint when you have 5 years of support with an Ubuntu LTS... Thankfully, Valve proved you wrong.



              • Re

                Originally posted by oliver View Post
                As for your clicking and KCalc comparison, I blame Qt for being a behemoth; the calculator under Ubuntu (GTK-based, that is) starts fast.
                I just launched KCalc under Kubuntu and it launches instantly... BTW, the GTK calculator is just awful. Press "." a few times... Now press "." in KCalc...
                I don't like it when people lie just because they hate something (in this case KDE and/or Qt).



                • Originally posted by birdie View Post
                  The mentioned problems are getting fixed all the time; the problem is that I see no end in sight. Besides, if something doesn't work in Linux, people won't f*cking care whose fault it is: "It works in Windows/MacOS/whatever - Linux sucks", and they are right.
                  When something doesn't work on OS X or Windows, people say it sucks, and they're right. If you have nothing smart to say, just don't say stupid things. It was usually third-party drivers causing the trouble; it's much better now, though not perfect yet.



                  • Originally posted by Poliander View Post
                    The Linux ecosystem is far too shattered to build something able to compete with MacOS or Windows in terms of "general usability". Look at all those different desktop environments, window managers, package formats, audio and input subsystems, distributions with different look-and-feels, license and copyright implications and restrictions, unstable programming interfaces... I'm convinced that Linux would get far more support from end users, software donators, and hardware vendors if the whole thing weren't so directionless, volatile, and hostile.

                    (Of course, this diversity is also a kind of strength. Most Linux users are well aware of those strengths and know how to benefit from the freedom of choice. But these are two sides of the same coin...)
                    I always wonder why some people make the same mistake all the time. Linux is just a kernel; if you want to compare something to Windows or OS X, you have to compare Linux distributions. Ubuntu is not fragmented: it has a single DE, a single package format, etc. Vendors should just focus on Ubuntu and the problem is solved. It seems they're even going to do so - Valve, EA. Most of the comments here are just a bunch of bull, because people don't understand such a simple thing.



                    • Originally posted by RussianNeuroMancer View Post
                      Link for people who are not aware of birdie's trolling: http://phoronix.com/forums/showthrea...C-Vendor/page2
                      I was sure I remembered this troll from somewhere. Ubuntu just kills Windows and OS X in most respects, and the few others are third-party dependent.



                      • Originally posted by birdie View Post
                        Linux constantly evolves? OMG. You even boast about that.

                        Windows evolves too, but its APIs and ABIs are rock solid. On Windows 8 I can run software written for ... Windows 3.1, which was released some 20 years ago. Try that feat with Linux software.
                        And that's why Windows is so damn vulnerable. It's no mystery that there are security holes left over from the DOS era! That makes your APIs and ABIs a bunch of crap.

                        Worthless points? Great, go and convince a single company to port its applications from Windows to Linux. While you are theorizing, I deal with big ISVs and I know what they want from Linux. But you are free to disagree with me; just forget about Linux ever having more than 3% of the desktop.
                        You're simply dumb. All Linux is missing is games and some software. That's all. Games are coming, and there's more and more software as well. Like I said before, Valve proved you wrong.



                        • Originally posted by birdie View Post
                          Yep, that's the case. But given today's storage sizes, it's not a problem at all.

                          On 64-bit Windows 7, under winsxs:

                          Total DLLs: 6828
                          Unique DLLs: 3604
                          Don't forget to count the dozens of VB runtimes and .NET versions. That makes it a bloated, security-vulnerable mess.
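
                          For anyone who wants to reproduce counts like the ones quoted above, a quick throwaway sketch (point ROOT at whatever tree you want to scan):

                              import os

                              ROOT = r"C:\Windows\winsxs"  # or any directory tree

                              total, unique = 0, set()
                              for dirpath, _dirs, files in os.walk(ROOT):
                                  for name in files:
                                      if name.lower().endswith(".dll"):
                                          total += 1
                                          unique.add(name.lower())

                              print("Total DLLs:", total)
                              print("Unique DLLs:", len(unique))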



                          • Originally posted by elanthis View Post
                            This is amazing. Thank you for this link; it'll save me a lot of time explaining to people why we don't waste time porting games to Linux.
                            I bet nobody would want to play your stupid console games anyway.

                            There's that, plus the complete lack of real QA. The Microsoft Azure QA team alone is larger than the entire developer and tester pool of X.org, Mesa, GTK, GNOME, and the DRI/DRM bits of the kernel. You would crap yourself if you had any idea of the size of the team responsible for just DirectX.
                            That's real bullshit. I have explained to you many times that there is real QA in Ubuntu, so stop lying. I have also explained that when it comes to manpower, MS lost years ago.

                            FOSS, for all the talk about it being open to anyone and having "millions of eyeballs," simply does not have a very large developer pool, and has an even smaller "support staff" pool. Real professionals with actual skill and talent do their work for pay. The good ones do it for a lot of pay. They do not have free time after working 40-60 hours a week to hack on a hobbyist OS or software, and even if they did, they would probably rather spend their non-working hours doing something other than slaving away on more software. Take that, together with the fact that the largest FOSS companies have teeny-tiny profits compared to even run-of-the-mill proprietary software companies (Red Hat's recent $1b _gross revenue_ is about the same as the _net profit_ of some of the smaller well-known software companies), and you have the cause of the practically barren developer pool on the important FOSS projects, and the reason the folks working on those projects keep complaining about being so seriously under-manned.
                            What a stupid troll you are. Most Linux and ecosystem developers are paid for their work.



                            • Originally posted by elanthis View Post
                              The actual stats show that the only people leaving Windows are the ones going to Macs. Of course, they also show that a lot of the hardcore Linux users of the 1990s and early 2000s moved to Macs too. Linux was at 1% in 1999. It's at 1% now. Projecting from historical evidence indicates that it'll still be at 1% in 2025.

                              Ignoring your bunch of bull, it's enough to say that OS X survives only because of marketing. It has been aiming at the desktop since the beginning and it's still a very niche OS. MS succeeded because there was no real competition in the early years, and then they had a monopoly. When good software comes to Linux, only idiots will keep using Windows. In 2025 there will be MS Linux.

                              I actually worked at a very large government installation for years, doing in-house software development and some light sysadmin work on the Linux server farm. This was back in my Linux fanboy days. It was actually one of the larger catalysts that turned me from a "Linux is the future" proponent like yourself into a "Linux is a nice web server OS, but thank God there's someone I can give money to in exchange for a less frustrating desktop experience" believer.
                              No, you turned into an MS fanboy, and you keep lying every time there's an article about Linux. Only idiots want MS to rule, so you're an idiot. Linux is far more usable and better in many respects, and thank God it exists.

                              Government jobs will best illustrate for you just how great Windows is for idiots who can't tell their assholes from floppy drives and how awful Linux is for "idiots" who can't figure out how to read unified diff files generated by dpkg when foobar-1.7.2b changes config file compatibility with foobar-1.7.2a.
                              That's sad, because you just proved that idiots prefer Windows (not noobs, but idiots).
                              Last edited by kraftman; 06-11-2012, 05:59 AM.



                              • The ATI & nVidia graphics drivers are, from my point of view, a big issue.
                                Besides that, I'm using Arch Linux for my day-to-day office work and it's a great source of knowledge.
                                Where else can you find such precise documentation for free: the kernel's Documentation tree (Kernel/Doc.txt), manuals, the source code itself?
                                Remember, during the MS-DOS age you had to buy books to understand the PSP, MCBs, INT 21h...

