
Third-party software installation for any Linux distribution


  • deanjo
    replied
    Originally posted by elanthis View Post
    Are those packages on the Platinum DVD that Joe User gets when he buys NWN? If they were, would those DVDs magically update with the newer packages when new distros come out? Or is the user saddled with the task of finding the correct installer for his flavor of Linux?
    Adding the repo is all that is needed. No different than having to hunt down a Windows game patch. The installer could be as simple as a bash script that detects the distro version and downloads the appropriate RPMs for that distro. Not exactly rocket science.
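A minimal sketch of the kind of distro-detecting script described above (the repo URLs, and reading the ID from /etc/os-release, are illustrative assumptions, not Bioware's actual layout):

```shell
#!/bin/sh
# Hypothetical installer stub: map a detected distro ID and version to
# the vendor repository that should be added. All URLs are made up.

repo_for_distro() {
    # $1 = distro ID, $2 = distro version
    case "$1" in
        fedora)        echo "https://example.com/repos/fedora/$2/vendor.repo" ;;
        opensuse*)     echo "https://example.com/repos/opensuse/$2/vendor.repo" ;;
        debian|ubuntu) echo "deb https://example.com/repos/$1 $2 main" ;;
        *)             return 1 ;;   # unsupported distro: caller bails out
    esac
}

# On a real system the values would come from /etc/os-release:
#   . /etc/os-release
#   repo_for_distro "$ID" "$VERSION_ID"
repo_for_distro fedora 8   # prints https://example.com/repos/fedora/8/vendor.repo
```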


    Are your buddies non-uber-nerds? If so, did they get this installed with no help or even mention of the packages' existence by yourself?
    Actually, a couple of them are "non-uber-nerds". If I had access to Bioware's FTP, setting up a repo wouldn't be hard.



  • Thetargos
    replied
    I do see the points you are making... and I do notice what you say about me resorting to doing things in the shell rather than in a GUI (or a shell inside a GUI). It so happens that my WHOLE household and my dad's computer run on Linux. Even my wife (who isn't a computer-lab-rat) runs Linux with very little intervention on my part. However, when it comes to installing cryptic shell-script-based applications, sure enough, she gives me a call.

    Which is precisely why I believe that rather than having everything localized to a certain group of distributions, it should be done upstream: define an installation procedure (rather than an actual installer) and have PackageKit (which in the end was one of its goals) handle the actual installation.

    Maybe I'm being too simplistic, but the way I see it is: there are any number of Linux distributions (Ubuntu, Mint, Knoppix, PCLOS), and most of them are based on one of the big-name distributions (Red Hat, SuSE, Mandriva, Debian), whose offspring in turn give way to new distributions. HOWEVER, infrastructural packages and mechanisms are usually kept, in that Red Hat offspring (including SuSE and Mandrake/Mandriva) will use not only the RPM package format, but also (to some degree) the same package naming and location within the filesystem, just as is the case with *buntu and Debian. Where that is NOT the case, well, that is why the LSB was created in the first place, so adhering to the LSB for such things would make life easier for everyone (up to a point).

    By having an upstream project take care of this, the issue of local idiosyncratic mechanisms, problems, and barriers is diluted. Companies would only need to comply with the requirements of the project, and the individual distributions using it would be responsible for managing their local pesky details.

    So as an example:

    Distributor A releases their game for Linux with an install script and a compressed archive with the game assets. Instead of shipping a GUI/TUI installer within the self-extracting archive, the install script checks the system for the basic necessary infrastructure (PackageKit) and passes the installation parameters to the host OS's PackageKit.

    PackageKit would in turn verify whether the system has all it needs to install the program (dependency solving), either remotely (from the distro's repositories) or locally (from local media) if no network connection is available. It would launch the appropriate frontend for the situation (a TUI if run from the console; GTK+ or Qt if run from within a GNOME/KDE/XFCE/LXDE session; such frontends would be made available by the distribution, of course!). A simple mechanism such as XML parsing would let the application "customize" the host OS installer wizard window to some degree (license agreement, graphics, welcome message, etc.). If run as a regular user, it would ask whether the user would like to install locally (in $HOME) or system-wide, the latter causing PackageKit to bind to PolicyKit in order to get the appropriate credentials.

    Whatever the format of the archive for the game assets, there should be a mechanism to accomplish two important things:
    1. Make sure the appropriate files and file formats are provided so that the application can be registered with the distro's package management system (be it RPM or dpkg), which is important so that...
    2. ...the application can be updated. Optionally, the archive could include the necessary information for the vendor's update repositories, which could be set up (at the user's option) using any of the popular mechanisms. Even support for various distribution versions should be relatively easy.
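These two steps could be sketched as a small helper the install script runs; every file name and URL below is invented purely for illustration:

```shell
#!/bin/sh
# Hypothetical helper for the two steps above: pick the registration
# file for the host's package backend, and emit the vendor update repo
# for a given distro. All names and URLs are made up.

registration_file() {
    # $1 = package backend ("rpm" or "dpkg")
    case "$1" in
        rpm)  echo "metadata/mygame.spec" ;;
        dpkg) echo "metadata/mygame.control" ;;
        *)    return 1 ;;   # unknown backend: let the caller bail out
    esac
}

update_repo() {
    # $1 = distro ID, $2 = distro version
    echo "https://updates.example.com/mygame/$1/$2/"
}

registration_file rpm    # prints metadata/mygame.spec
update_repo fedora 14    # prints https://updates.example.com/mygame/fedora/14/
```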


    I seriously think that overconvoluting the issue is an exaggeration. Sure, to achieve this a consensus must first be reached and the corresponding project launched.



  • elanthis
    replied
    Originally posted by deanjo View Post
    LOL, it isn't a hard job, seriously. I'll use NWN Diamond edition as an example. I have long abandoned their antiquated installer and have used my local build service to maintain installer RPMs for Fedora and openSUSE, with profiles for every version since openSUSE 10.3, Fedora 8, and Mandriva 2008.
    Are those packages on the Platinum DVD that Joe User gets when he buys NWN? If they were, would those DVDs magically update with the newer packages when new distros come out? Or is the user saddled with the task of finding the correct installer for his flavor of Linux?

    My buddies and I have been using them for years with the three different distros.
    Are your buddies non-uber-nerds? If so, did they get this installed with no help or even mention of the packages' existence by yourself?



  • deanjo
    replied
    Originally posted by elanthis View Post
    No, no no no. Hard media's data files may be out of date (although rarely the entire set of files), but the installers just keep working. That's fine. You install the 4GB of data, it says there's updates, you download maybe an extra 20M of patches, and bam you're good.

    The actual binaries rarely break, unless they statically link shit they shouldn't. Linux in general is pretty good about maintaining a stable ABI, actually. Almost every single update-breakage or distro-incompatibility issue is artificially injected into the mix by the distributions' packaging infrastructures. Over and over and over again.

    Which is exactly what happens with RPMs, DPKGs, and .bin installers. You release your DVD in May 2010. By November 2010, half the distros can't install from the DVD anymore. You need to install to get the updater to run to download updates... but you can't install to get an updated installer. Hmm.

    This isn't make-believe. This actually happens today, frequently, with a lot of software that is shipped outside of the distro's central repository, including every single commercial game ever shipped for Linux. You're pretty much forced to go online, download a hacked-up community installer, and then use that to get the files off the DVD. It's retarded. There are plenty of Open Source apps where I can find a Fedora 12 i386 RPM or an Ubuntu 9.04 DPKG or whatever, but oddly I seem to be running Fedora 14 x86_64, which is not supported. Sucks to be me, I guess. I need to downgrade to an unsupported OS to get a supported package? Sure, a lot of the time you can get older RPMs to install just fine. Plenty of times, you cannot. All because someone decided to merge a package or split a package or rename a package, and now incompatibility is artificially introduced.
    LOL, it isn't a hard job, seriously. I'll use NWN Diamond edition as an example. I have long abandoned their antiquated installer and have used my local build service to maintain installer RPMs for Fedora and openSUSE, with profiles for every version since openSUSE 10.3, Fedora 8, and Mandriva 2008. There was still one NWN patch to be finalized when I started, plus some other patches like the nwmovies patch. These have all stood up over the years using the build service. The game's assets are in a simple tar.bz2 file, and items like the community pack and the clients are in their own RPMs. Problems like the game not working on x64 because of the incompatible SDL libraries it shipped are a thing of the past as well, since the game is now set to use the distribution's own version. It has worked extremely well over the years. It has even handled things like changing RPM build specifications (switching to LZMA, patch RPMs, etc.) very gracefully, with minimal effort. Even the dependencies are taken care of. My buddies and I have been using these packages for years across the three different distros.



  • elanthis
    replied
    Originally posted by deanjo View Post
    lol, ok you are really making it sound A LOT worse than it is. First of all, ANY hard media will have the issue of being "outdated" as bugs are patched and fixed.
    No, no no no. Hard media's data files may be out of date (although rarely the entire set of files), but the installers just keep working. That's fine. You install the 4GB of data, it says there's updates, you download maybe an extra 20M of patches, and bam you're good.

    The actual binaries rarely break, unless they statically link shit they shouldn't. Linux in general is pretty good about maintaining a stable ABI, actually. Almost every single update-breakage or distro-incompatibility issue is artificially injected into the mix by the distributions' packaging infrastructures. Over and over and over again.

    Which is exactly what happens with RPMs, DPKGs, and .bin installers. You release your DVD in May 2010. By November 2010, half the distros can't install from the DVD anymore. You need to install to get the updater to run to download updates... but you can't install to get an updated installer. Hmm.

    This isn't make-believe. This actually happens today, frequently, with a lot of software that is shipped outside of the distro's central repository, including every single commercial game ever shipped for Linux. You're pretty much forced to go online, download a hacked-up community installer, and then use that to get the files off the DVD. It's retarded. There are plenty of Open Source apps where I can find a Fedora 12 i386 RPM or an Ubuntu 9.04 DPKG or whatever, but oddly I seem to be running Fedora 14 x86_64, which is not supported. Sucks to be me, I guess. I need to downgrade to an unsupported OS to get a supported package? Sure, a lot of the time you can get older RPMs to install just fine. Plenty of times, you cannot. All because someone decided to merge a package or split a package or rename a package, and now incompatibility is artificially introduced.

    Second of all, upgrades are easily handled with such a build system. You also do not need to repackage all the game assets. That part is actually really easy, as a simple tar.bz2 file can be used for those items. The only thing that has to be distro-specific is the installer.
    And when the software is no longer being actively supported, the installers get out of date. See every single commercial Linux game for examples.

    As it is opensource you are free to host it on their beefy mirror infrastructure at no cost.
    Nobody hosts 4GB packages like that yet. They will change their tune, quickly. Debian, Fedora, and so on have had issues with mirrors getting upset over package size increases in even the recent past.

    Plus, that's a half-solution for Open Source software, and a complete non-solution for everyone else. Particularly games, which are very rarely Open Source due to that whole $10m+ development cost they kinda like to recoup a little.

    Then there's still the problem of distributing those per-distro installers. I go to foo-soft.com's website and click the big Download link. The browser can already detect if I'm running Windows, OS X, or Linux, and point me to a proper installer for the platform. Well, except for Linux, because it has no freaking clue which of 20 different installers to give me. The burden of sorting through and finding the proper installer (if it even exists) is offloaded to the user. That's after already putting the burden on the distributor to make these multiple installers (even with the SUSE Build Service), instead of just being able to run a local "make-installer" tool over a tiny installer script like I can do on Windows with tools like InnoSetup.

    Really, you're making a mountain out of a molehill.
    So I've packaged for Debian, Arch, and Mandrake, and am currently a packager for Fedora. I also maintain installers for commercial Windows applications. I have supported user bases of over 200 people on heterogeneous networks comprised of multiple versions of Windows and several flavors of "desktop friendly" Linux distributions.

    Guess which machines were a complete and utter bitch to maintain support with software rollouts and which took close to zero effort? (Hint: it's not the Windows ones.)

    The situation is not as bad as it used to be. It's still not perfect.

    I doubt you notice it a lot. I doubt most people on this forum notice it. Every single person here is an uber Linux nerd. When you have to do some crazy Linux thing, you don't even notice anymore. It's normal. You've done it for years. It's well-honed habit. Hell, compiling a kernel is _fun_. Shells are easier and more efficient to use than a GUI. GCC and Vim are an incredibly powerful configuration toolset.

    Try getting your non-uber-nerd friends and family on Linux. See how often you have to go over and do for them the most trivially basic stuff. Things like GNOME having icons in different places than Windows adds enough overhead as it is (and no, I'm not saying that Linux must look and act like Windows; just noting it already has a steep learning curve just by being different, without being harder). As soon as they go to download an app and it asks them if they're running Fosd Core Linux or Debutan Linux or Ubernutu Linux and whether they're running the 36-bit or 68-bit version and then tells them to set the Executive Byte on the file and then open the 1980's black typey-thingy window to run $.foo/bin, they say "fuck this stupid shit, I'm going back to a computer thingy that doesn't require a Master's degree in not-having-a-life."

    The really screwed up part? It was actually easier to install and run Windows games on Linux than to install the Linux native games. If Wine is installed, they literally just had to pop in the CD, double click setup.exe, and bam, it installed and ran. The Linux apps? Required me to come over and hack the shell scripts to work around the way the distros decided to change xhost and alter how CD-ROMs are mounted, and then manually run the shell-based updater that requires root. That is _retarded_. I recall this so clearly because of one friend -- a very poor college student who could barely scrape together the $300 it took for me to build his computer without a Windows license -- who only wanted to play two games: NWN and Warcraft 3. He got Warcraft 3 installed by himself and needed me to fix NWN to install. He ended up selling off some stuff just to go buy a Windows XP DVD about 3 weeks after we got that computer set up, as soon as he realized he couldn't actually patch or update NWN or install modules without me.

    I don't particularly care if the uber-nerds don't see a problem. They never will. Most FOSS programmers don't even care about games at all. Doesn't mean games don't matter. The vast majority of people who own a PC play games on it. Being a FOSS programmer already makes a person an oddball minority. Them being a non-gamer doesn't change that. The oddball minorities can go be oddballs that do things the slow painful way all they want. The rest of us -- be they people who don't want to learn all that crap because they'd rather spend the time playing a game, going outside, or getting laid; or even just nerdy people who are just sick of wasting time doing the same 20 extra steps over and over after a decade of doing them -- don't want that anymore. We want things to just work with a minimum of fuss.

    The stop-energy that the FOSS programmers keep throwing up is disgusting. Is it going to hurt you any if it's easier for other people to install software? Are you threatened? Do you want to keep the Linux userbase "pure" ? There is a problem. Even if it is a small problem, why for any reason do you not want any problem to be fixed? Especially one that is EASY to fix, if only people stopped blocking it every chance they get?



  • deanjo
    replied
    Originally posted by elanthis View Post
    So instead of a single DVD, they have to ship 20 DVDs each containing a variant RPM/DPKG of the same 2GB of game file data? And that 20 DVD set will be out-dated and may no longer work within six months when all the distros refresh and swap up the default library install sets? Or, if this is a pure-Internet distribution, the distributor has to find hosting and maintain space for all those versions of the exact same binaries and data files?
    lol, ok you are really making it sound A LOT worse than it is. First of all, ANY hard media will have the issue of being "outdated" as bugs are patched and fixed.

    Second of all, upgrades are easily handled with such a build system. You also do not need to repackage all the game assets. That part is actually really easy, as a simple tar.bz2 file can be used for those items. The only thing that has to be distro-specific is the installer. So you would have multiple distro-specific installers in RPM and DEB formats and one generic tarball that carries the assets. Also, web distribution is typically a lot cheaper than hard media distribution, and it reaches a greater audience; Steam/iTunes/etc. all show that. It also usually means fewer "fingers in the pie" when it comes to the revenue stream.

    The per-distro packages are completely non-maintainable. Many of the actual distributions today are already having a lot of growing pains maintaining their repositories. Building the packages is just one tiny little part of the massive problem that the status quo of Linux installation has. Distributing, maintaining, debugging, and supporting the myriad of files necessary is very much a big part of the problem, and tools that just generate multiple packages don't solve any of that.
    Using a build service addresses a lot of those pains and is easily updated to accommodate newer distro releases. Even patch RPMs/DEBs are not that big of an issue.

    The space issue is also a huge problem with the distro silos. Let's pretend some big AAA game went fully Open Source. Would Fedora really be cool with having all of their mirrors add a 4GB game data file? And not just once; each version of Fedora would end up having that file duplicated to all the mirrors. Twice for each version: once for the base repository and again for the updates repository, assuming any patches for the game data are released. If the game were added to Fedora in version 15 then within two years that game alone would be consuming between 20GB and 40GB of space on every Fedora mirror.

    And then duplicate that for each version of Ubuntu, Debian, SUSE, Mandrake, Arch, etc. etc. etc. Most of the common mirrors hosting Linux distros now will throw a fit and probably start refusing to host the distros, or they'll force the distros to split up their repositories and add yet more maintenance burden. Instead, let the game project site deal with the data. Let them mirror it. Let them figure out the logistics, and only really need a small handful of mirrors for just that data that anyone of any distro can use.

    Hey, that would be great if a game went entirely open source. That way they wouldn't even have to bother with any associated cost of mirroring and distribution. There are plenty of examples like that on openSUSE's build service right now. As it is open source, you are free to host it on their beefy mirror infrastructure at no cost.

    Really, you're making a mountain out of a molehill.



  • elanthis
    replied
    Originally posted by Svartalf View Post
    Packaging doesn't fix that. I think the Puppygames "oops" of .debs should show that NONE of this is handled unless the vendor handles it for you- and .debs and .rpms do NOT change any of what you disclose as a problem. (Do keep in mind that even if it DID fix it, you'd have to get each distribution's rules, etc. right which means you'd have to make a package for each distribution. Not going to happen, just so you know...)
    I'm sorry, I'm not really sure what points you're responding to. I'm explicitly saying to use something other than .rpms and .debs to be cross-distro, to use a standardized set of platform definitions shipped with the installer framework that ensure that a well-defined set of libraries/ABIs are available (these are not shipped with the individual applications, because that leads to breakages like old apps shipping old SDL that don't mesh with newer Linux audio frameworks) to make dependency resolution cross-distro, and to maintain strict policies in the installers about file locations and the like (except for the .desktop files which are handled by the installer framework) to be cross-distro.

    Originally posted by Thetargos
    Why create a separate entity to handle the installation? (installer) with all that it would require (special file format, fetch the app, etc)
    It's possible to do that, sure. PackageKit's GUI would need a large amount of additions and updates for various features that it does not support now (as it's really just a veneer over apt/yum/etc. currently). If the PackageKit folks are game, that's probably a fine way to go.

    I have a strong suspicion, though, that the PackageKit folks would rather have that installer be a separate binary that just uses the PackageKit libraries. In fact, PackageKit itself has no GUI at all. The GNOME and KDE GUIs you see are separate sub-projects built on top of PackageKit itself.

    Originally posted by deanjo
    A lot of the packaging issues could be handled fairly easily if they would set up a SUSE build service
    So instead of a single DVD, they have to ship 20 DVDs each containing a variant RPM/DPKG of the same 2GB of game file data? And that 20 DVD set will be out-dated and may no longer work within six months when all the distros refresh and swap up the default library install sets? Or, if this is a pure-Internet distribution, the distributor has to find hosting and maintain space for all those versions of the exact same binaries and data files?

    The per-distro packages are completely non-maintainable. Many of the actual distributions today are already having a lot of growing pains maintaining their repositories. Building the packages is just one tiny little part of the massive problem that the status quo of Linux installation has. Distributing, maintaining, debugging, and supporting the myriad of files necessary is very much a big part of the problem, and tools that just generate multiple packages don't solve any of that.

    The space issue is also a huge problem with the distro silos. Let's pretend some big AAA game went fully Open Source. Would Fedora really be cool with having all of their mirrors add a 4GB game data file? And not just once; each version of Fedora would end up having that file duplicated to all the mirrors. Twice for each version: once for the base repository and again for the updates repository, assuming any patches for the game data are released. If the game were added to Fedora in version 15 then within two years that game alone would be consuming between 20GB and 40GB of space on every Fedora mirror.

    And then duplicate that for each version of Ubuntu, Debian, SUSE, Mandrake, Arch, etc. etc. etc. Most of the common mirrors hosting Linux distros now will throw a fit and probably start refusing to host the distros, or they'll force the distros to split up their repositories and add yet more maintenance burden. Instead, let the game project site deal with the data. Let them mirror it. Let them figure out the logistics, and only really need a small handful of mirrors for just that data that anyone of any distro can use.

    Going back to the application publishers hosting their own packages, this is still a problem if they need to repackage their data 20 times over. You have to pay for hosting space as a commercial entity, and Open/Free projects just need to find a willing set of mirrors. If you need to host multiple gigabytes of data because of artificial barriers to compatibility imposed by the packaging systems when you natively only have a few hundred megabytes of data, that's going to be a huge pain in the ass.



  • Thetargos
    replied
    Ohh... And just to round up my idea...

    The necessary metadata for "package registration" with the host OS package management system (dpkg/rpm) could simply be found in the archive, with all the requirements, description, components, manifest, etc., letting PackageKit figure out how to register the app with the host OS. Also, maybe have an option for local vs. system-wide installation: if local, a single user could install into their $HOME directory without needing superuser rights; if system-wide, bind to PolicyKit to get the necessary credentials.
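As a rough illustration of what such in-archive metadata might look like (the format is entirely hypothetical; every element name, version, and URL below is invented):

```xml
<!-- Hypothetical metadata file shipped inside the asset archive so the
     host's PackageKit can register the app with rpm/dpkg. All names,
     versions, and URLs are made up for illustration. -->
<application id="com.example.mygame" version="1.69">
  <description>My Game, a commercial title for Linux</description>
  <license-file>docs/LICENSE.txt</license-file>
  <manifest>files.list</manifest>
  <depends>
    <lib name="libSDL-1.2" min-version="1.2.10"/>
    <lib name="libstdc++.so.6"/>
  </depends>
  <update-repo>https://updates.example.com/mygame/</update-repo>
</application>
```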

    I know (and Svartalf would probably comment a bit further on this) that it would not always be possible to use "native" packages for some of the application's dependencies, as the app may require some very specific versions and symbols, hence vendors providing the necessary dependencies themselves (see the case of libstdc++ in many games)... However, it would be nice to use native libs whenever possible and let the native tools handle the dependency resolution (with the necessary metadata provided by the app installation framework, of course).



  • Thetargos
    replied
    It took me a while to go through all of your post... It does seem you have it well thought through; however, a few comments:

    Why create a separate entity to handle the installation? (installer) with all that it would require (special file format, fetch the app, etc)

    The way I see it, as a start it could be made much simpler and maybe easier if it would be possible to:

    Have the actual installer be part of PackageKit; the .bin/.run/whatever should only call it with the "right" parameters and "credentials" to start the install process. The host OS's PackageKit would take care of the GUI (GTK+, Qt, Tcl/Tk, ncurses, etc.), using whatever the host OS has installed as part of PackageKit and the environment (GNOME, KDE, LXDE, XFCE, etc.). For Internet-distributed packages, this would make much sense.

    For media-distributed programs, it could also be made possible to have a single install.sh file with only the required arguments and the paths for fetching the files from the local media, and pass that to PackageKit to handle.
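A minimal sketch of such an install.sh, assuming PackageKit's console frontend (pkcon) does the distro-specific work; the mount point and package path are made up:

```shell
#!/bin/sh
# Hypothetical install.sh for a media-distributed game: build a single
# PackageKit (pkcon) invocation instead of shipping a GUI installer.
# The disc layout below is invented for illustration.

MEDIA=${1:-/media/cdrom}
PKG="$MEDIA/packages/mygame-launcher.rpm"

CMD="pkcon install-local $PKG"

# Default to a dry run so the sketch is safe to execute anywhere;
# set DRY_RUN=0 on a real system with PackageKit installed.
if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$CMD"
else
    $CMD    # PackageKit resolves dependencies via the host's backend
fi
```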

    There are a few tricky parts, as you have exposed, like product validation and DRM, but I definitely think that spawning a different entity other than PackageKit would not be such a great idea in the long run; see the case you made about needing an installer to install the installer for app XYZ. Also, by relying on PackageKit, developers could rely on host OS libraries more effectively and only supply a few mandatory ones, or even install multiple instances and have the application use one specifically (though this wouldn't be much different from Windows, where many apps install their bundled VC runtime libraries, for example). It could be made so that, for example, the installer looks first for the native packages and, if a required version is not found for the distro, supplies the remote repository/local media where it may be found.



  • deanjo
    replied
    A lot of the packaging issues could be handled fairly easily if they would set up a SUSE build service. It may not cover all distros, but it certainly covers the most commonly used ones: Fedora, openSUSE, *buntu, Debian, and Mandriva.

