
Why Linux Appears Fragmented:


  • ethana2
    started a topic Why Linux Appears Fragmented:


    http://images.cheezburger.com/comple...7850293860.png

...and then game publishers use 'fragmentation' as an excuse to give us .run, .bin, .tar.gz, .sh, .tar.bz2 and .py files instead of proper .debs, which irritates me to no end.

I want this question answered with hard statistical facts, and the only way to do that is to file bugs against every browser package on every Linux-based desktop that doesn't identify itself and its host correctly and in sufficient detail. I'm not going to set about doing that without help. Who's with me?

  • pingufunkybeat
    replied
    Originally posted by prophet5
This would also mean that all updates would need to be backward compatible with older versions.
    There, you've answered your own question



  • prophet5
    replied
    You'll find that you're going to have that with any distribution. Even Sabayon will eventually have that "forced upgrade" thing as you get nifty new functionality. Happens with Windows. Happens with MacOS as well. Now, perhaps the rate at which it happens might be slower with Sabayon- but it will STILL happen.
Too true, but at least the upgrade happens only if I want the nifty new functionality, i.e. if my system at the time supports it. My experience with K/Ubuntu was not so kind: the repo for the version I was running came to an end, so I needed a distro upgrade (which didn't work, and ended in a fresh download of an ISO) to a release that my hardware at the time could not handle with any real performance. With Sabayon, even though it uses Entropy as its package manager, I don't have to upgrade when it switches branches. I can still use Portage for any apps I may want to install and compile against the kernel and drivers that work for my hardware.

    In the end, your idea, while it seems "obvious", isn't such a hot idea after all.
    Well...at least there's a reason



  • Svartalf
    replied
    Originally posted by prophet5 View Post
    Svartalf, At risk of going off topic a bit, I'd like to ask you a question.
    Nary a problem.

I've moved away from K/Ubuntu to Sabayon because I liked the rolling-release nature of a Gentoo-based distro and hated the "forced upgrade" nature of Ubuntu (i.e. once the repos for your version are no longer supported, you have to upgrade the whole system, which may or may not work), but I'm not by any stretch of the imagination a Linux whiz. I'm just an end user with enough know-how to create a symlink, which I've had to do on occasion to satisfy the versioning requirements of one or two programs.
    You'll find that you're going to have that with any distribution. Even Sabayon will eventually have that "forced upgrade" thing as you get nifty new functionality. Happens with Windows. Happens with MacOS as well. Now, perhaps the rate at which it happens might be slower with Sabayon- but it will STILL happen.

My question is, why bother with versioning? I mean, yes, update libs and such, and by all means refer to the version number in tarballs and package managers, but why not use the same name once installed? E.g. 'libxyz1.2' -> 'libxyz1.3' when installed would just be 'libxyz'. Would this not solve the compatibility issue when a program asks for a particular lib but can only find the updated version? This would also mean that all updates would need to be backward compatible with older versions.
1) Versioning is what avoids DLL HELL. There is no enforced versioning whatsoever with DLLs.

2) Without versioning, how do you know which is which when there happen to be improvements in the API as well as extra edges added in? A change from 1.0 to 2.0 may well radically change things. A change from 1.2 to 1.4 may add an extra function or change how a current function goes about doing things. Just reusing the same name for the library isn't a good idea: if the app presumes a 1.4 edge and 1.2 had slightly differing rules for usage, you can end up with a crashing app, and no idea as to WHY.

    In the end, your idea, while it seems "obvious", isn't such a hot idea after all.



  • prophet5
    replied
    Svartalf, At risk of going off topic a bit, I'd like to ask you a question.

I've moved away from K/Ubuntu to Sabayon because I liked the rolling-release nature of a Gentoo-based distro and hated the "forced upgrade" nature of Ubuntu (i.e. once the repos for your version are no longer supported, you have to upgrade the whole system, which may or may not work), but I'm not by any stretch of the imagination a Linux whiz. I'm just an end user with enough know-how to create a symlink, which I've had to do on occasion to satisfy the versioning requirements of one or two programs.
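The symlink workaround mentioned here can be sketched as follows; the library name and paths are invented, and whether the trick works depends entirely on the two versions actually being ABI-compatible:

```shell
# An app demands libfoo.so.1.2, but only libfoo.so.1.3 is installed.
# Pointing the old name at the new file satisfies the loader's lookup.
mkdir -p /tmp/linkdemo && cd /tmp/linkdemo
touch libfoo.so.1.3                  # stand-in for the installed library
ln -sf libfoo.so.1.3 libfoo.so.1.2   # the compatibility symlink
ls -l libfoo.so.1.2                  # -> libfoo.so.1.3

# On a real system this would target the system lib dir and need root, e.g.:
#   sudo ln -s /usr/lib/libfoo.so.1.3 /usr/lib/libfoo.so.1.2
# If 1.3 changed behavior, the app now misbehaves or crashes at run time
# instead of failing cleanly at load time -- the trade-off of this trick.
```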

My question is, why bother with versioning? I mean, yes, update libs and such, and by all means refer to the version number in tarballs and package managers, but why not use the same name once installed? E.g. 'libxyz1.2' -> 'libxyz1.3' when installed would just be 'libxyz'. Would this not solve the compatibility issue when a program asks for a particular lib but can only find the updated version? This would also mean that all updates would need to be backward compatible with older versions.



  • Svartalf
    replied
    Originally posted by nanonyme View Post
Hence my initial rant about binary packages coming from upstream being a terrible idea: upstream in all likelihood doesn't want to keep porting the program to the newest libraries, especially if it's the normal kind of commercial game where there's an initial fee, after which the game maker moves on to the next product. Some idealistic open-source coder might do this forever, but IMO it's completely unrealistic to expect it from commercial companies, since it stops being cost-efficient after a certain time.
Indeed. And this is part of the reason why the idea of the LSB is nice, but its execution will be a nightmare somewhere, somehow. I'm side-stepping the issue by shipping select managed .so's that are known to work fine with everything distributed at the time, and using an RPATH to point the application (in this case, a game...) at the custom, controlled .so's. With LSB versioning, the way things have gone for the last 5-7 years, you'd be keeping all sorts of junk lying around JUST to support "compatibility". If you want the honest truth, it's little different on the Windows side of things, and actually faintly worse.



  • nanonyme
    replied
    Originally posted by Svartalf View Post
libpng12. It's on the current crop of distributions, but NOT on Arch and a few of the other latest ones. They've gone to libpng14. You don't want the old stuff lurking around for a LONG time, so some of the distributions removed it and deprecated it (i.e. a compat package...).
Hence my initial rant about binary packages coming from upstream being a terrible idea: upstream in all likelihood doesn't want to keep porting the program to the newest libraries, especially if it's the normal kind of commercial game where there's an initial fee, after which the game maker moves on to the next product. Some idealistic open-source coder might do this forever, but IMO it's completely unrealistic to expect it from commercial companies, since it stops being cost-efficient after a certain time.



  • Remco
    replied
    Originally posted by Svartalf View Post
    Actually, what you're proposing isn't any better. I'll get to that in a moment...



    Actually no, it won't. You didn't catch the thing I was telling everyone when I posted earlier on the subject.

libpng12. It's on the current crop of distributions, but NOT on Arch and a few of the other latest ones. They've gone to libpng14. You don't want the old stuff lurking around for a LONG time, so some of the distributions removed it and deprecated it (i.e. a compat package...). Caster 1.1 (which is what you'd get if you bought it right at the moment...) will install and NOT work on the latest Arch and a few others without you compiling or installing an unsupported "compat" package. In the LSB story you're telling, you'd have to have a RAFTLOAD of stuff lying about just to support this and that.

When 2.0 ships, I'll be forcing SDL_image to not dlopen the png and jpeg libs but load them like any other .so by normal linkage, and then forcing it to use the libpng14 in my ./libs dir to avoid this problem. LSB wouldn't FIX this. Your idea of LSB might fix it, with a lot of other issues (i.e. some of the stuff won't live nicely together on the same system...). RPM doesn't fix this. Directory structures don't fix this. Specifying a library doesn't fix this unless we get a lot more consistent versioning in about 1/3 or so of the libraries in the Linux system space.

LSB is a solution by someone who didn't grok what the problem really was: it's a partial solution, and a clumsy one at that.
    The obvious solution is to keep libpng12 around in the repositories for as long as you support the LSB version for which libpng12 is required. It requires consistent naming and versioning, yes. Consistency like this is achieved with a comprehensive LSB spec, and evangelizing the goals of the LSB to upstream projects. The absence of such consistency is a serious bug, which makes life difficult for distribution developers and independent software vendors alike.

    Maybe it would be a good idea to start a bit smaller. For example, Debian and Ubuntu tried to decide on common versions of certain important packages for their next releases. This didn't succeed, but I think they need to keep trying. A widely implemented LSB won't be built in one day.



  • Svartalf
    replied
    Originally posted by Remco View Post
    I don't believe that the LSB should specify which packages should be installed. That's way too inflexible. It should just do what distributions do: specify dependencies for each package. This is why it's important to settle the package format debate.
    Actually, what you're proposing isn't any better. I'll get to that in a moment...

    So, basically what happens is this: the LSB specifies the package versions that Linux distributions have to offer in 2011. Then, Debian Unstable will create the biggest repository in existence, and all distributions will derive from that (but that's not required). Fedora will have more bleeding edge stuff added to it, and all is good.

    Then comes along Maya 2012, which is an LSB-2011 compliant package. It will work just like a normal distribution package, except that it won't depend on the bleeding edge stuff from Fedora for example. It will still just install everything it needs, and all dependencies will be there.
    Actually no, it won't. You didn't catch the thing I was telling everyone when I posted earlier on the subject.

libpng12. It's on the current crop of distributions, but NOT on Arch and a few of the other latest ones. They've gone to libpng14. You don't want the old stuff lurking around for a LONG time, so some of the distributions removed it and deprecated it (i.e. a compat package...). Caster 1.1 (which is what you'd get if you bought it right at the moment...) will install and NOT work on the latest Arch and a few others without you compiling or installing an unsupported "compat" package. In the LSB story you're telling, you'd have to have a RAFTLOAD of stuff lying about just to support this and that.

When 2.0 ships, I'll be forcing SDL_image to not dlopen the png and jpeg libs but load them like any other .so by normal linkage, and then forcing it to use the libpng14 in my ./libs dir to avoid this problem. LSB wouldn't FIX this. Your idea of LSB might fix it, with a lot of other issues (i.e. some of the stuff won't live nicely together on the same system...). RPM doesn't fix this. Directory structures don't fix this. Specifying a library doesn't fix this unless we get a lot more consistent versioning in about 1/3 or so of the libraries in the Linux system space.

LSB is a solution by someone who didn't grok what the problem really was: it's a partial solution, and a clumsy one at that.
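The dlopen detail above is why this failure mode stays invisible until the game actually runs: `ldd` only reports libraries recorded in the ELF dynamic section, i.e. the normally linked ones. A quick illustration on any handy dynamically linked binary:

```shell
# Normally linked dependencies are declared in the binary and checkable
# up front -- a package manager or installer can verify them:
ldd /bin/ls                       # lists libc and friends

# Any line reading "not found" here means the install is already broken:
ldd /bin/ls | grep 'not found' || echo 'all declared deps resolved'

# Libraries pulled in via dlopen() at run time (as SDL_image does for
# libpng/libjpeg by default) never appear in this list, so a missing
# libpng12 only surfaces once the program tries to load a PNG.
```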



  • Svartalf
    replied
    Originally posted by energyman View Post
Ubuntu is completely irrelevant in that regard.

RHEL and SLED/SLES are the important distros out there. They use RPM. Problem solved. It is not their fault that Ubuntu sucks.
    Excuse me...

    1) Using terms like "sucks" is BS and fanboy-ish. DROP IT.

2) RHEL and SLED/SLES are TWO of the important distros out there. But, unfortunately for YOU, they're not the only ones. Amongst the others are Debian, Ubuntu, Mandriva, MontaVista, Angstrom, MeeGo, and Android (yes). As an observation, only PART of those distributions use RPM. RPM isn't that great a packaging system: it's a good answer for servers and maybe desktops, but it's BLOATED for things like mobile devices.

    RPM doesn't fix the problems I mentioned in the slightest. So, problem NOT solved.

