
Linux is not ready for Steam


  • darkphoenix22
    started a topic Linux is not ready for Steam


    I think Valve should wait a year before porting Steam. The multimedia and audio frameworks and APIs on Linux are currently a mess. The APIs and frameworks need to be stabilized before any sort of large-scale professional game development can begin on Linux.

    http://braid-game.com/news/?p=364

    If Valve ports Steam to Linux in its current state, I'm afraid it will become unprofitable and they'll quickly abandon it. If that happened, there would be virtually no professional Linux games for years.

    Linux needs to be ready before we fall under the spotlight.

  • darkphoenix22
    replied
    I agree. There needs to be a much larger emphasis on binary compatibility in the Linux community. Major API/ABI changes are fine, just make them once a year. :P



  • Dragonlord
    replied
    The problem is not bug fixing. Minor revisions should only change the "function" of a library but not the "interface" or "contract" (aka design by contract). The main problem is that some libraries break the API on minor revisions. Granted, C++ is crap when it comes to libraries. It totally lacks encapsulation. Even adding a single private class member can totally kill the ABI and cause apps to crash.
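    A minimal sketch of that failure mode, with a made-up class name, is below: the same class as seen by an application built against the old header and as compiled into the updated .so. The sizes (and member offsets) no longer agree, which is exactly how a "private members only" change ends up corrupting memory.

        #include <cstdio>

        // Hypothetical illustration only: two "minor" revisions of the same library class.
        namespace v1_0 {                 // the header the application was compiled against
        class Mixer {
        public:
            void setVolume(float v) { volume_ = v; }
        private:
            float volume_;
        };
        }

        namespace v1_1 {                 // what the updated .so actually contains
        class Mixer {
        public:
            void setVolume(float v) { volume_ = v; }
            void mute()             { muted_ = true; }
        private:
            float volume_;
            bool  muted_;                // one extra private member
        };
        }

        int main() {
            // The application allocates objects using the size it saw at compile time,
            // while the library's code assumes the new, larger layout; that mismatch
            // is what makes "just adding a private member" an ABI break.
            std::printf("sizeof(Mixer) the app was built with: %zu\n", sizeof(v1_0::Mixer));
            std::printf("sizeof(Mixer) inside the new .so:     %zu\n", sizeof(v1_1::Mixer));
        }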



  • darkphoenix22
    replied
    Originally posted by kazetsukai View Post
    This would solve a lot of problems while slowing development. Newer libraries get their bugs caught because they're immediately put into use, so with longer cycles library development would suffer and quality control would go down.
    I think just having year-long development cycles with binary-compatible bug-fix releases (a la service packs) would be enough to fix this problem.

    Originally posted by kazetsukai View Post
    There is no really good solution to this without changing the core foundation of how libraries work. Perhaps if libraries had the ability to identify what functionality was present in them with no other information to the main binary this could be solved much more easily.

    Think about it- we could throw away other package management solutions entirely. The application binary would tell the "lib fetcher" what library functions it needed, and the libraries themselves would report this information to the repository or "lib server". Rather than downloading packages of libraries, you'd download individual libraries or source tarballs which provided them and compile those on the fly.
    So basically, manifest files with multiple versions of libraries sitting in RAM and on the HD. I think this would cause more problems than it would fix, not to mention adding a layer of complexity and making these libraries harder to test.

    I think the package management systems we have currently are fine. We just need to slow down development to allow for bug fixes and stabilization.



  • kazetsukai
    replied
    Originally posted by darkphoenix22 View Post
    Though the quick pace of the development of the core libraries is making library issues on Linux more of a problem than they should be. I believe that we should slow the development of the core libraries to yearly schedules, which should make it MUCH easier for closed source development. As an example, upgrading GNOME to a major and often binary incompatible new release every 6 months is madness. There's barely any time for bug fixing. Same with the development of the kernel and X11.

    It would do a lot of good if the distributions synced on a yearly schedule.
    This would solve a lot of problems while slowing development. Newer libraries get their bugs caught because they're immediately put into use, so with longer cycles library development would suffer and quality control would go down.

    I was grinding my teeth writing the last post... there are plenty of Closed Source apps which have "no way out" of the library problem. Three words:

    I. Hate. Oracle.

    There is no really good solution to this without changing the core foundation of how libraries work. Perhaps if libraries had the ability to identify what functionality was present in them with no other information to the main binary this could be solved much more easily.

    Think about it- we could throw away other package management solutions entirely. The application binary would tell the "lib fetcher" what library functions it needed, and the libraries themselves would report this information to the repository or "lib server". Rather than downloading packages of libraries, you'd download individual libraries or source tarballs which provided them and compile those on the fly.

    Application: "I require these library functions." => Lib fetch process
    Libraries: "We provide the following library functions." => Lib server process
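    A purely hypothetical sketch of that exchange, just to make the idea concrete (no such "lib fetcher" or "lib server" exists; every name here is invented):

        #include <algorithm>
        #include <iostream>
        #include <string>
        #include <vector>

        // A manifest names either the functions a library provides
        // or the functions an application requires.
        struct Manifest {
            std::string name;
            std::vector<std::string> symbols;
        };

        // "Lib server" side: what each available library reports it provides.
        std::vector<Manifest> library_catalog() {
            return {
                {"libaudio.so", {"audio_open", "audio_play"}},
                {"libimage.so", {"png_read", "png_write"}},
            };
        }

        // "Lib fetcher" side: match an application's required functions
        // against the catalog and decide which libraries to download or build.
        std::vector<std::string> resolve(const std::vector<std::string>& required) {
            std::vector<std::string> to_fetch;
            for (const auto& lib : library_catalog()) {
                for (const auto& sym : required) {
                    if (std::find(lib.symbols.begin(), lib.symbols.end(), sym) != lib.symbols.end()) {
                        to_fetch.push_back(lib.name);   // this library satisfies something we need
                        break;
                    }
                }
            }
            return to_fetch;
        }

        int main() {
            // Application: "I require these library functions."
            std::vector<std::string> needs = {"audio_play", "png_read"};
            for (const auto& lib : resolve(needs))
                std::cout << "fetch " << lib << '\n';
        }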



  • darkphoenix22
    replied
    This is getting off-topic but eh:

    Definitely. I still remember the pain of installing KDE 3.0 on Mandrake 8.2 with just plain RPM.

    But the ability to recompile applications against newer libraries (or patch them for compatibility), together with the complex and comprehensive package management systems used by most modern Linux distributions and the BSDs, goes a long way toward alleviating the pain of dependency hell.

    The closed source operating systems don't have nearly as much flexibility in this regard. This isn't a closed-source problem; it's more of a symptom.

    -----

    Though the quick pace of the development of the core libraries is making library issues on Linux more of a problem than they should be. I believe that we should slow the development of the core libraries to yearly schedules, which should make it MUCH easier for closed source development. As an example, upgrading GNOME to a major and often binary incompatible new release every 6 months is madness. There's barely any time for bug fixing. Same with the development of the kernel and X11.

    It would do a lot of good if the distributions synced on a yearly schedule.



  • kazetsukai
    replied
    Originally posted by darkphoenix22 View Post
    This is a closed-source problem TBH.
    The source model has nothing to do with this; library hell exists with Open Source applications as well. Remember when we changed over from kernel 2.4 to 2.6, and almost simultaneously changed GCC versions as well?

    Originally posted by darkphoenix22 View Post
    Newer libraries may be binary incompatible with the games, so closed source developers are forced to bundle their libraries with their programs.
    Not specific to games either... any application can become a victim of a library upgrade gone wrong. Generally this is why we have minor and major version changes.

    Library hell exists on Linux as much as on any other system, but distribution maintainers know that throwing every version of a library onto a box to maintain compatibility is really bad practice. Pointing the finger at the closed source apps is wrong and solves nothing.
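    For reference, that major/minor distinction is already encoded in ELF sonames on Linux: a compatible minor update keeps the soname, while an incompatible change is supposed to bump it. A small sketch of what an application actually pins itself to (libexample and example_init are made-up names):

        #include <dlfcn.h>
        #include <cstdio>

        int main() {
            // The soname "libexample.so.1" names ABI major version 1. Compatible
            // minor/patch releases (1.0.3 -> 1.2.0) keep that soname, so existing
            // binaries pick them up automatically; an incompatible rewrite should
            // ship as libexample.so.2 and leave version-1 binaries alone.
            void* handle = dlopen("libexample.so.1", RTLD_NOW | RTLD_LOCAL);
            if (!handle) {
                std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
                return 1;
            }
            // dlsym only tells us the symbol exists; it cannot tell us whether struct
            // layouts or semantics still match what we were compiled against, which is
            // why a library that breaks ABI without bumping its soname is the
            // "upgrade gone wrong" case described above.
            void* fn = dlsym(handle, "example_init");
            std::printf("example_init %s\n", fn ? "found" : "missing");
            dlclose(handle);
            return 0;
        }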



  • Svartalf
    replied
    Originally posted by Dragonlord View Post
    Let's rephrase it and say it "should" not be like this. Granted, I've seen a bunch of libraries which are coded in an ungodly way. I don't think, though, that OpenAL for example would be one of them. It's a pure C library interface if I'm not mistaken. Stuff like libpng, though, you can compile directly into the binary. Fewer problems than having a bunch of libs trailing behind you no matter where they are located.
    Let's just say you're going to find that it doesn't always work out the way you envision... OpenAL might be something you'd expect to work right- but...yeah, it's a C lib, with consistent ABI edges, but...

    If you look at the range of what was shipped until very recently, you'd find that OpenAL SI was still shipping in most distributions until about 1-2 years ago. OpenAL SI had developed serious bitrot and was unreliable for game use. So, what do you do? Do you do what you propose here, or do you ship a ./libs dir with a vetted .so that users can remove if they'd rather use the system copy? If you did what you proposed, you'd have issues down the line because it doesn't play nice.

    For libpng12, if you compiled it directly into your binary, you'd have issues if your code also used a wrapper like SDL_image. Linking it in yourself would be a potential liability, as the wrapper might be built against a different edge of the same library.

    You need to be cautious and think things through there. What you're describing is the ideal, but it'll never quite happen all the time. And it's the same in every environment other than the consoles, where there's a largely constrained world with exacting control over firmware revs, etc.
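    A rough sketch of the "./libs dir" approach mentioned above: ship a vetted copy of the library next to the game, but fall back to the system copy if the user removes it. The paths and the OpenAL soname here are illustrative, not a statement about how any particular game does it:

        #include <dlfcn.h>
        #include <cstdio>

        // Prefer the vetted copy shipped in ./libs, if the user kept it;
        // otherwise fall back to whatever the distribution provides.
        static void* load_openal() {
            if (void* h = dlopen("./libs/libopenal.so.1", RTLD_NOW | RTLD_LOCAL))
                return h;
            return dlopen("libopenal.so.1", RTLD_NOW | RTLD_LOCAL);
        }

        int main() {
            void* openal = load_openal();
            if (!openal) {
                std::fprintf(stderr, "no usable OpenAL found: %s\n", dlerror());
                return 1;
            }
            std::puts("OpenAL loaded");
            // RTLD_LOCAL keeps the bundled copy's symbols out of the global namespace,
            // which reduces (but does not eliminate) clashes with a second copy pulled
            // in indirectly, e.g. SDL_image linking the system libpng.
            dlclose(openal);
            return 0;
        }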



  • darkphoenix22
    replied
    This is a closed-source problem TBH.

    Newer libraries may be binary incompatible with the games, so closed source developers are forced to bundle their libraries with their programs.

    This is a problem on Windows as well, where nearly all major applications statically link their libraries to avoid DLL hell. To mitigate this, Microsoft also includes multiple versions of the libraries in the WinSxS folder.

    The need to run multiple versions of each library to maintain compatibility with applications is the main reason why Windows and Mac OS X take up so much RAM compared to an install of Linux, where single versions of dynamic libraries, installed and updated by package managers, are more commonplace.



  • Dragonlord
    replied
    Let's rephrase it and say it "should" not be like this. Granted, I've seen a bunch of libraries which are coded in an ungodly way. I don't think, though, that OpenAL for example would be one of them. It's a pure C library interface if I'm not mistaken. Stuff like libpng, though, you can compile directly into the binary. Fewer problems than having a bunch of libs trailing behind you no matter where they are located.

