Linux is not ready for Steam

  • Originally posted by Dragonlord View Post
    Shipping such libs with a game is anyways big time bull-crap.
    Try shipping something linked against libpng12 sometime and you might change that tune a smidge: Arch Linux users will be complaining at you over it, just for starters. I've been bitten by that one recently, along with a few other gems like it.

    Not all libs play as nicely as you'd expect them to, Dragonlord. To normalize things you do, unfortunately, have to provide select .so's to ensure they are present in the versions you expect. You can more safely link against things like libc, but unless you're part of the distribution itself, it's very risky to assume that everything except the game's own binary will be provided by the distribution. Missing libs, incompatible libs... it's a quagmire if you don't ship a few select .so's.

    I suspect OpenAL-Soft gets bundled with the title because it cannot be presumed to be on a given install unless a game that needed it has already been installed. Trying to get users to resolve their dependencies themselves is... entertaining. So, much to everyone's dismay, you do that "bull-crap" thing to ensure the availability and compatibility of key libs you can't be assured will be there for you.

    Comment


    • Let's rephrase it and say it "should" not be like this. Granted, I've seen a bunch of libraries that are coded in an ungodly way, but I don't think OpenAL would be one of them: it's a pure C library interface, if I'm not mistaken. Stuff like libpng you can compile directly into the binary. That's fewer problems than having a bunch of libs trailing behind you wherever you go.

      Comment


      • This is a closed-source problem, TBH.

        Newer libraries may be binary-incompatible with the games, so closed-source developers are forced to bundle their libraries with their programs.

        This is a problem on Windows as well, where nearly all major applications statically link or bundle their libraries to avoid DLL hell. To mitigate this, Microsoft also ships multiple versions of its system libraries in the WinSxS folder.

        The need to keep multiple versions of each library around to stay compatible with applications is a main reason why Windows and Mac OS X use so much RAM compared to an install of Linux, where single versions of dynamic libraries, installed and updated by the package manager, are more commonplace.

        Comment


        • Originally posted by Dragonlord View Post
          Let's rephrase it and say it "should" not be like this. Granted, I've seen a bunch of libraries that are coded in an ungodly way, but I don't think OpenAL would be one of them: it's a pure C library interface, if I'm not mistaken. Stuff like libpng you can compile directly into the binary. That's fewer problems than having a bunch of libs trailing behind you wherever you go.
          Let's just say you'll find it doesn't always work out the way you envision... OpenAL might be something you'd expect to work right, and yes, it's a C lib with consistent ABI edges, but...

          If you look at what was shipped until very recently, you'd find most distributions were still carrying OpenAL SI until about a year or two ago, and OpenAL SI had developed serious bitrot and become unreliable for game use. So what do you do? Do you do what you propose here, or do you ship a ./libs dir that lets you provide a selected .so that's been vetted for broad use? If you did what you propose, you'd have issues down the line, because it doesn't play nice.

          As for libpng12: if you compiled it into the binary, you'd have issues as soon as your code also used a wrapper like SDL_image. Linking it in statically is a potential liability, because the wrapper might be built against a different version of the same library than the one you baked in.

          You need to be cautious and think things through here. What you describe is the ideal, but it will never quite happen all the time, and that's true of every environment other than the consoles, where there's a tightly constrained world with exacting control over firmware revisions and the like.

          Comment


          • Originally posted by darkphoenix22 View Post
            This is a closed-source problem, TBH.
            The source model has nothing to do with this; library hell exists with open-source applications as well. Remember when we changed over from kernel 2.4 to 2.6 and, almost simultaneously, changed GCC versions as well?

            Originally posted by darkphoenix22 View Post
            Newer libraries may be binary-incompatible with the games, so closed-source developers are forced to bundle their libraries with their programs.
            That's not specific to games either: any application can become the victim of a library upgrade gone wrong. Generally, this is why we distinguish minor and major version changes.

            Library hell exists on Linux as much as on any other system, but distribution maintainers know that throwing every version of a library onto a box to maintain compatibility is bad practice. Pointing the finger at closed-source apps is wrong and solves nothing.

            Comment


            • This is getting off-topic but eh:

              Definitely. I still remember the pain of installing KDE 3.0 on Mandrake 8.2 with just plain RPM.

              But the ability to recompile applications against newer libraries (or patch them to be compatible), together with the comprehensive package management systems used by most modern Linux distributions and the BSDs, goes a long way toward alleviating the pain of dependency hell.

              The closed-source operating systems don't have nearly as much flexibility in this regard. This isn't a closed-source problem so much as a symptom of being closed source.

              -----

              Though the quick pace of core library development is making library issues on Linux more of a problem than they need to be. I believe we should slow the core libraries down to yearly release schedules, which would make closed-source development MUCH easier. As an example, upgrading GNOME to a major, often binary-incompatible release every six months is madness; there's barely any time for bug fixing. The same goes for the development of the kernel and X11.

              It would do a lot of good if the distributions synced on a yearly schedule.

              Comment


              • Originally posted by darkphoenix22 View Post
                Though the quick pace of core library development is making library issues on Linux more of a problem than they need to be. I believe we should slow the core libraries down to yearly release schedules, which would make closed-source development MUCH easier. As an example, upgrading GNOME to a major, often binary-incompatible release every six months is madness; there's barely any time for bug fixing. The same goes for the development of the kernel and X11.

                It would do a lot of good if the distributions synced on a yearly schedule.
                This would solve a lot of problems, but at the cost of slowing development. New libraries catch bugs precisely because they are immediately put into use, so library development would suffer and quality control would go down.

                I was grinding my teeth writing that last post... there are plenty of closed-source apps which have "no way out" of the library problem. Three words:

                I. Hate. Oracle.

                There is no really good solution to this without changing the core foundation of how libraries work. Perhaps if libraries could identify what functionality they provide, with no other information needed from the main binary, this could be solved much more easily.

                Think about it: we could throw away other package management solutions entirely. The application binary would tell the "lib fetcher" what library functions it needs, and the libraries themselves would report that information to the repository, or "lib server". Rather than downloading packages of libraries, you'd download individual libraries, or source tarballs that provide them and get compiled on the fly.

                Application: "I require these library functions." => Lib fetch process
                Libraries: "We provide the following library functions." => Lib server process

                Comment


                • Originally posted by kazetsukai View Post
                  This would solve a lot of problems, but at the cost of slowing development. New libraries catch bugs precisely because they are immediately put into use, so library development would suffer and quality control would go down.
                  I think just having year-long development cycles with binary-compatible bug-fix releases (a la service packs) would be enough to fix this problem.

                  There is no really good solution to this without changing the core foundation of how libraries work. Perhaps if libraries could identify what functionality they provide, with no other information needed from the main binary, this could be solved much more easily.

                  Think about it: we could throw away other package management solutions entirely. The application binary would tell the "lib fetcher" what library functions it needs, and the libraries themselves would report that information to the repository, or "lib server". Rather than downloading packages of libraries, you'd download individual libraries, or source tarballs that provide them and get compiled on the fly.
                  So basically, manifest files, with multiple versions of libraries sitting in RAM and on the disk. I think this would cause more problems than it would fix, not to mention adding a layer of complexity and making these libraries harder to test.

                  I think the package management systems we currently have are fine. We just need to slow down development to allow for bug fixing and stabilization.

                  Comment


                  • The problem is not bug fixing. Minor revisions should only change the "function" of a library, not its "interface" or "contract" (as in design by contract). The main problem is that some libraries break the API on minor revisions. Granted, C++ is crap when it comes to libraries: it totally lacks encapsulation. Even adding a single private class member can kill the ABI and cause apps to crash.
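                    A minimal illustration of that last point, using hypothetical class names: private members visible in the header bake the object's size into every caller, while the "pimpl" idiom hides the layout behind an opaque pointer so it can change freely between revisions:

```cpp
#include <memory>
#include <string>

// Fragile: every private member is part of the binary interface,
// because callers compile sizeof(FragileWidget) into their own code.
class FragileWidget {
public:
    void draw() {}
private:
    int x_ = 0, y_ = 0;  // adding `std::string label_;` here breaks the ABI
};

// Stable: callers see only one pointer; the real layout lives inside
// the library and can grow between minor revisions.
class StableWidget {
public:
    StableWidget() : p_(new Impl{}) {}
    void draw() {}
private:
    struct Impl {            // in a real library this would be defined
        int x = 0, y = 0;    // only in the .cpp file, never in the header
        std::string label;
    };
    std::unique_ptr<Impl> p_;
};

// Holds on mainstream ABIs: the public object stays one pointer wide
// no matter what is later added to Impl.
static_assert(sizeof(StableWidget) == sizeof(void*),
              "pimpl keeps the object size stable");
```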

                    Comment


                    • I agree. There needs to be a much larger emphasis on binary compatibility in the Linux community. Major API/ABI changes are fine, just make them once a year. :P

                      Comment
