Third-party software installation for any Linux distribution

  • #16
    Originally posted by elanthis View Post
    Are those packages on the Platinum DVD that Joe User gets when he buys NWN? If they were, would those DVDs magically update with the newer packages when new distros come out? Or is the user saddled with the task of finding the correct installer for his flavor of Linux?
    Adding the repo is all that is needed. It's no different than having to hunt down a Windows game patch. The installer could be as simple as a bash script that detects the distro version and downloads the appropriate RPMs for the user's distro. Not exactly rocket science.
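    As a sketch of what that detection script could look like (purely illustrative; the format mapping is an assumption, not anything Bioware actually ships):

```shell
#!/bin/sh
# Hypothetical sketch of the "detect the distro, fetch the right
# packages" step. The mapping below is illustrative, not exhaustive.
detect_pkg_format() {
    # $1: path to an os-release style file (normally /etc/os-release)
    . "$1"
    case "$ID" in
        fedora|rhel|centos|opensuse*) echo rpm ;;
        debian|ubuntu)                echo deb ;;
        *)                            echo tarball ;;
    esac
}
```

    An installer would call `detect_pkg_format /etc/os-release` and then download the matching package set for that distro.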


    Are your buddies non-uber-nerds? If so, did they get this installed with no help or even mention of the packages' existence by yourself?
    Actually, a couple of them are "non-uber nerds". If I had access to Bioware's FTP, setting up a repo wouldn't be hard.



    • #17
      Originally posted by deanjo View Post
      Adding the repo is all that is needed. It's no different than having to hunt down a Windows game patch. The installer could be as simple as a bash script that detects the distro version and downloads the appropriate RPMs for the user's distro. Not exactly rocket science.
      I just disagree on the acceptable level of difficulty, then. I don't like having to hunt down game patches on Windows, either. (And I don't anymore; thank you, Steam.) The apps should auto-update.

      On Windows, this is a bitch because there is no standard software update service that any developer can tap into. Microsoft has theirs, Oracle ships a whole separate service for Java updates, OpenOffice.org has its own setup for OOo updates, Apple has the Apple Software Updater, and then any game that Valve accepts into Steam can use that. Oh, Adobe has their own, too. You end up running 20 services just to keep your software up to date. I do not want to emulate that model, obviously. This is something Linux does way better right now, today.

      I think it's a shame to be better in one way but not fully capitalize on it. One single installer for Linux that you can just click and it installs and then keeps things auto-updated: that's nice. Absolutely no sysadmin knowledge required, period.

      Distributors don't need to know a damn thing about distributions, don't need to rely on third-party build services, don't need to have a sysadmin maintain repositories long after a product is no longer supported just to keep it installing. They build a single unified repo that works for all distros, they update it only if the application itself has updates, and that's it. Once they stop updating the app, they don't need to worry about the repo bit-rotting. It just sits on their FTP server and keeps on truckin'.

      Actually, a couple of them are "non-uber nerds". If I had access to Bioware's FTP, setting up a repo wouldn't be hard.
      Did they get it set up on their own, completely, no help or guidance?

      Maybe it really is just a little harder. Okay. I disagree, but let's say I'm wrong (it happens, I'll admit it).

      Why not make it _even easier_? There's just no good reason to settle for even "slightly harder" when it's possible to avoid it. What do you, or anyone else, lose by making it easier?

      If we avoided every slight improvement in usability because it was only marginally better than the status quo, we'd still all be using xfwm on non-composited X11 over Slackware.

      Originally posted by Thetargos
      Distributor A releases their game for Linux with an install script and a compressed archive with the game assets. Instead of shipping a GUI/TUI installer within the self-extracting archive, have the install script check the system for the basic necessary infrastructure (PackageKit) and pass the installation parameters to the host OS's PackageKit.
      That's more or less exactly what I'm thinking.

      I'm not sure if you're offering a counterpoint to my proposal; if so, we're totally on the same page here. I'm sorry if I wasn't clear in the first post: when I said the installer GUI app, that is to be something that is installed natively to the OS and comes with the OS, not something you distribute with the actual applications. It needs to be a part of the distro repository silos, essentially making that installer a core part of the "standard Linux platform."

      I don't think native integration into the system's package manager is mandatory. It would be very convenient and preferable, absolutely. I'd rather have it than not, certainly. But it's something that can be worked around if there are technical or social barriers to making that happen, which I expect at least at first.

      AutoPackage for instance maintained its own package database. It was annoying only because, at the time, PackageKit did not exist to provide a unified UI between the native system package DB and the AutoPackage DB, and AutoPackage did not include an updater service, only an installer.



      • #18
        Originally posted by elanthis View Post

        I doubt you notice it a lot. I doubt most people on this forum notice it. Every single person here is an uber Linux nerd. When you have to do some crazy Linux thing, you don't even notice anymore. It's normal. You've done it for years. It's well-honed habit. Hell, compiling a kernel is _fun_. Shells are easier and more efficient to use than a GUI. GCC and Vim are an incredibly powerful configuration toolset.
        I LOL'd when I read that.

        Even though I understand all the nerdy stuff, I hate doing everything you just mentioned.

        I'm an Ubuntu user (a Windows migrant) and most of us hate dealing with all that crap. If it's not in a .deb installer, we barely touch it.

        Anyway, I just don't see why other distros should have as much priority as Debian-based distros.

        If you check the avg. stats at the bottom of this counter:

        http://www.dudalibre.com/gnulinuxcounter?lang=en

        You'll see that Ubuntu has 60% and Debian has 20%, plus all the other Debian/Ubuntu-based distros have about 10% more.

        So Debian/Ubuntu account for 85%+ of Linux marketshare, which I suspect is not so far off from the real deal.

        So if I were to launch a game, I would just release one .deb installer and one compressed file with instructions for all the others. So if you're using a distro other than Ubuntu, you should be the "uber-geek" who can handle that sort of stuff.

        At least I know Ubuntu is already dedicated to solving that problem for their distro.

        Sure, I would love one universal installer that works really well and becomes a standard, but as far as I can see that is just not going to happen soon (though I would love to be wrong).



        • #19
          I feel that you are attacking the problem from the wrong angle. A universal installer is not the solution (we have enough of those already). What we need is a universal solution for *creating* installers, similar to how cmake works for build systems.

          At a high level, your typical application has the following components:
          1. executables
          2. dependencies
          3. data files
          4. global configuration
          5. user configuration
          6. documentation
          7. license agreement

          And that's it. You create a simple script that documents which files belong to each category, you feed it to the tool, and it spits out the scripts that build the final .deb, .rpm, .tar.gz (and even .exe and .dmg) files. Add some trivial browser detection and the user will never have to deal with the installer issue again. See http://www.opera.com/browser/download/ for a perfect, real-world example of this.
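          A manifest for such a tool might look something like this (a hypothetical format, sketched here for illustration; the section names mirror the categories listed above):

```ini
; Hypothetical manifest for a cross-distro packaging tool.
; Each section maps source files to one of the categories above;
; the tool would emit .deb/.rpm/.tar.gz build scripts from it.
[package]
name    = examplegame
version = 1.2

[executables]
bin/examplegame = /usr/bin/examplegame

[data]
share/* = /usr/share/examplegame/

[documentation]
README = /usr/share/doc/examplegame/

[license]
file = LICENSE.txt
show_eula = true
```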

          This is how cmake works: it doesn't build your code (that's extremely difficult given the multitude of platforms), it simply generates makefiles for your platform and lets the building be handled by the compiler/IDE. This is the only sane approach.

          The main difficulty is the handling of dependencies. Simple if you bundle these as binaries, but rather more difficult if you wish to use the shared libraries from downstream. Again, cmake shows how to do this: it provides invocable scripts that detect most common dependencies (OpenGL, OpenAL, SDL, GLUT, etc) and you can create (or download) your own scripts for dependencies that are not recognized out of the box. Simple and intuitive!
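          By loose analogy with cmake's Find modules, each dependency could get a small probe, with bundling as the fallback (a sketch; the probe commands are illustrative stand-ins, not any real tool's API):

```shell
#!/bin/sh
# Sketch: probe for each shared dependency; use the system copy when
# found, otherwise fall back to a bundled private copy. The probe
# below is an illustrative placeholder.
probe_sdl() { command -v sdl-config >/dev/null 2>&1; }

resolve_dep() {
    # $1: dependency name, $2: name of its probe function
    if "$2"; then
        echo "$1: use system copy"
    else
        echo "$1: bundle private copy"
    fi
}
```

          Unrecognized dependencies would get user-supplied probe functions, just as cmake users write their own Find modules.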

          As the tool matures, it will gain support for the individual packaging styles of each distro (i.e. Fedora and SuSE split things in different ways, despite both being rpm-based.) The community can help here tremendously, if the foundations are right.

          Seriously, something like this would simplify the lives of both open-source and closed-source developers.



          • #20
            (fake edit)
            Auto-updates should be handled by adding a software source to the user's package manager. Not all installable software will make use of this functionality, but it should be an option when creating the installer scripts.
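            Concretely, the installer script could drop a source definition into the package manager's configuration (a yum-flavored sketch; the repo name, URL, and target directory are made up, and a real installer would also import the vendor's signing key):

```shell
#!/bin/sh
# Sketch: register an auto-update source with the native package
# manager by writing a .repo file. The target would normally be
# /etc/yum.repos.d; it is a parameter here for illustration.
add_update_source() {
    repo_dir=$1
    cat > "$repo_dir/examplegame.repo" <<'EOF'
[examplegame]
name=Example Game updates (hypothetical)
baseurl=http://downloads.example.com/linux/$releasever/
enabled=1
gpgcheck=1
EOF
}
```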



            • #21
              Originally posted by elanthis View Post
              Did they get it set up on their own, completely, no help or guidance?
              Hey, they made it through installing Linux on their own, which requires more effort than installing NWN.



              • #22
                Originally posted by deanjo View Post
                Hey, they made it through installing Linux on their own, which requires more effort than installing NWN.
                That almost qualifies them as uber-nerds then.



                • #23
                  @elanthis

                  I wasn't offering a counterpoint, rather saying that things could be much simpler and could use existing infrastructure; there's no need to "reinvent the wheel" or "recreate the black thread", and at some point it kinda looks as if you are over-convoluting things.


                  The "installer", the way I see it, should not "live in the repos", but rather be intrinsic to the distro, with a common interface to the local package management infrastructure. "Self-installable packages" could be analogous to ODF files (simply specially crafted .zip files with a distinct extension), in that the internal hierarchy and the XML descriptors define the type of file and what it does (just as they make an .ods, .odp, or .odt file a spreadsheet, presentation, or text document, respectively).

                  I wouldn't advise using those as general packages, though; make them strictly for third-party software that wouldn't otherwise deal with native package management policies/hierarchies/licenses/etc.



                  • #24
                    Originally posted by Thetargos View Post
                    @elanthis

                    I wasn't offering a counterpoint, rather saying that things could be much simpler and use existing infrastructure
                    Well, the problem is, there isn't an infrastructure. There's RPM, DPKG, yum, apt, and the other native package tools. That's it. There is no magical PackageKit infrastructure like you are implying.

                    PackageKit does not replace or enhance the native package system. It's just a veneer over it. Literally, it's just a library and a GUI that abstracts "yum install" and "apt-get install" to "pkcon install", combined with some policies to make finding dependencies a bit easier. PackageKit knows absolutely nothing about how to unpack packages, resolve dependencies, download updates, or anything. All it does is call out to a yum or apt backend to do all the actual work. PackageKit is entirely and utterly incapable of installing any software that is not already packaged in the system's native package format, and likewise incapable of updating any software not in the distro's native repository format.
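                    That veneer can be pictured as a simple dispatch table (a dry-run sketch that only prints the command each backend would run; nothing here is PackageKit's actual internals):

```shell
#!/bin/sh
# Sketch of the "pkcon install -> native tool" dispatch described
# above. Dry-run: prints the backend command instead of running it.
native_install_cmd() {
    backend=$1; pkg=$2
    case "$backend" in
        yum) echo "yum -y install $pkg" ;;
        apt) echo "apt-get -y install $pkg" ;;
        *)   echo "unsupported backend: $backend" >&2; return 1 ;;
    esac
}
```

                    The point being: if no backend knows the package's format, the dispatch has nothing to dispatch to.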

                    So there has to be a new format defined, or you have to use one of the existing formats and use something like alien to get them into the native package databases. This is the path LSB tried to take by mandating RPM 3.x as the official package format; unfortunately, they left too much of the rest of the process completely unspecified. Plus, RPMs and the like don't support licenses/EULAs, which, while distasteful, are mandatory for most third-party publishers (heck, even a lot of Open Source/Free Software installers use them!).

                    Given the limitations of the existing formats, I feel it is best to create a new one. I have in the past proposed updates to both dpkg and rpm for more application-oriented installers and they have been roundly rejected because the developers think their locked-down package silos are a Freedom-saving feature rather than a user-hostile design flaw.

                    The "installer", the way I see it, should not "live in the repos", but rather be intrinsic to the distro, with a common interface to the local package management infrastructure
                    Which means the installer is a package in the distribution's package repository. All software in the distribution comes from its package repository, including (for example) rpm, PackageKit, yum, and so on. So the installer must be a part of the distribution's repository.

                    What I meant is the same thing you meant: the installer has to be a part of the distribution's native package set that the distribution ships and supports, not something the user must manually install by himself.

                    It doesn't have to be installed by default. The MIME/extension lookup feature Nautilus/Konqueror/Firefox implement via PackageKit means that the native packaging system can install the installer package on demand the first time the user clicks on an installer file. This is important, because I can guarantee you that the installer will not be in the default install set, because distros like Fedora have very strict policies and a lot of politics involved in deciding what goes into that default install set. Luckily, it doesn't matter. So long as the installer is in Fedora's repository, it'll Just Work if the user ever needs it. Magic!

                    I wouldn't advise using those as general packages, though; make them strictly for third-party software that wouldn't otherwise deal with native package management policies/hierarchies/licenses/etc.
                    I have no intention of replacing the core package infrastructure of a distribution. I'm not sure that even makes sense: the way you manage a _platform_ and the way you manage an _application_ are fundamentally different. You can build a single system that does both, but said system will be highly complex. RPM and DPKG, for instance, are massively complex beasts, and still have a crappy user experience for handling applications. Fedora/RedHat, for instance, relies on a whole separate database (comps) for logically grouping the plethora of packages that make up a single application into a user-friendly bundle... and the user still ends up being exposed to a ton of firefox-foobazbigglerarr package names in various bits of the graphical UI.

                    The way I've specified things is tightly tailored for actual applications (things that run in the GUI, get a menu entry, and of which the user is intended to be acutely aware), based on how actual application-oriented installers for both Windows and Linux have worked for many years. It looks big and complicated only because it's so different from how RPM and DPKG and such work today, because they're just not designed for application-oriented experiences.

                    If successful, it might be beneficial to expand beyond pure applications and into frameworks and plugins as well. Those have slightly different user stories than an application, slightly different requirements, but nothing too terribly onerous to add. Better to stay focused at first, though. Frameworks basically need to depend on development tools, which can be a bit trickier to rely on ("C Development Environment" is not something you can cleanly define as a platform), and plugins require dependencies on applications/frameworks at potentially very specific versions (e.g., this plugin works with LibAudioFoo version 1.1.3, because the authors of LibAudioFoo are asshats and break their plugin ABI every point-release; so you end up needing dependent updates or multiple installed versions, which a pure application has absolutely no need for, period).

                    (Suffice to say, after packaging for four distributions over the last 10 years and doing installer maintenance on Windows for almost as long, I've been putting a LOT of thought into this subject over the years. Just haven't had the gumption to do anything about it because all of the distributions have been very, very hostile to the idea of taking away their control over users' software, and I've seen every similar -- if less complete -- attempt over the years go through a ton of work just to rot out in the sun because the distributions wouldn't accept them. More often than not out of the fear that users will install Evil Immoral Proprietary Software... like games.)



                    • #25
                      Originally posted by elanthis View Post
                      That almost qualifies them as uber-nerds then.
                      I wouldn't say so. It is not hard to install most of today's distributions at all. Neither is installing software on them, for the most part. Software management programs have come a long way in figuring out dependencies and the like. Far simpler than in Windows, IMHO, where you have to search high and low for one piece of software over here and another over there, etc. Macs probably have the simplest installation/deinstallation procedure: drag and drop. Where I do, however, think that Linux is still in the dark ages is the many situations where manual editing of files, boot options, etc. is required for hardware support. Why the fuck a person has to decide things like v4l2 kernel driver options at boot is beyond me. Hell, you show a person a MythTV setup vs a Windows MCE setup, and there you will get the puzzled looks and the "Linux is hard" arguments. Linux may support a lot of hardware, but getting that hardware to a point where it is usable takes far too much headbanging in its present form.



                      • #26
                        One distro for the avg. user, that's all I'm going to say.



                        • #27
                          I wonder how Desura is going to handle game installation on Linux (http://www.desura.com/). It seems like they would benefit from a project like this.



                          • #28
                            Originally posted by ad_267 View Post
                            I wonder how Desura is going to handle game installation on Linux (http://www.desura.com/). It seems like they would benefit from a project like this.
                            Probably the same way that Steam on Windows (or some mythical Steam on Linux) would: by being its own packaging and update system. Some (but not all) Steam games seem to register with the Windows application list, so I imagine Desura and similar apps would enjoy having a framework available to them to do so, but it's not really something they need at all. They've already got all the code to download and install binaries and data and keep them updated.



                            • #29
                              It's still not uncommon to find software for Windows that doesn't have an installer, but instead comes in a zip file you unpack and then run the exe. This is practically identical to the tarball approach, and novice Windows users are okay with these apps too. What's your opinion on that?



                              • #30
                                Originally posted by elanthis View Post
                                That almost qualifies them as uber-nerds then.
                                Only if it was Arch they installed :P

                                Like others have pointed out, I also think you're making a big deal out of something not that important. Sure, granny may have a hard time installing third-party software via tarballs, but the average computer user, upon stumbling on a problem, already knows that typing the problem into Google will eventually reveal a solution.

                                PS: What do you mean by NWN being hard to install? Just the other day (when I found out that there was a Linux client... yes, under a rock, that's right) I downloaded the resources, client, and latest patch from the server, uncompressed everything, and it was working.

