APT 1.3 Released For Debian Linux Distributions

  • APT 1.3 Released For Debian Linux Distributions

    Phoronix: APT 1.3 Released For Debian Linux Distributions

    APT 1.3 is now available as the newest version of this Debian command-line package manager...


  • #2
    New corner-case features, while stuff like doing multiple installs at a time is still not a thing. As if they wanted to copy MS's shitty MSI installer...

    • #3
      Originally posted by eydee View Post
      New corner-case features, while stuff like doing multiple installs at a time is still not a thing. As if they wanted to copy MS's shitty MSI installer...
      Does any OS have multiple installs at a time? It does not seem to be that easy...

      • #4
        Originally posted by Passso View Post

        Does any OS have multiple installs at a time? It does not seem to be that easy...
        It does not seem possible to me, even. I mean, each package comes bundled with scripts, and no one can guarantee those scripts can run simultaneously. The best that can be done, I think, is parallel downloads. Maybe installation could start once the first package (together with its dependencies) is available.

        • #5
          Originally posted by bug77 View Post

          It does not seem possible to me, even. I mean, each package comes bundled with scripts, and no one can guarantee those scripts can run simultaneously. The best that can be done, I think, is parallel downloads. Maybe installation could start once the first package (together with its dependencies) is available.
          I *think* (not sure) that apt already does parallel downloads, at least if the URLs are from different domains. It wouldn't really make sense to start two downloads from ftp.debian.org (assuming a single download can saturate the available bandwidth), but if you install google-chrome at the same time (e.g. apt install google-chrome firefox), I think they are downloaded in parallel.

          ----------
          Sure enough, I just tried it; notice the last line:

          Code:
          $ sudo apt-get install --reinstall synaptic google-chrome-stable
          Reading package lists... Done
          Building dependency tree       
          Reading state information... Done
          0 upgraded, 0 newly installed, 2 reinstalled, 0 to remove and 0 not upgraded.
          Need to get 51.7 MB of archives.
          After this operation, 0 B of additional disk space will be used.
          Get:1 http://ftp.debian.org/debian/ jessie/main synaptic amd64 0.81.2 [1,522 kB]
          Get:2 http://dl.google.com/linux/chrome/deb/ stable/main google-chrome-stable amd64 53.0.2785.116-1 [50.2 MB]
          8% [2 google-chrome-stable 2,830 kB/50.2 MB 6%] [1 synaptic 1,335 kB/1,522 kB 88%]
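
          If I read apt.conf(5) correctly, this matches the default acquire queue mode: "host" opens one connection per target host, so fetches parallelize across different domains while packages from the same mirror come down one at a time. A minimal sketch of the knob (the file name below is just an example):

          Code:
          # /etc/apt/apt.conf.d/99queue-mode -- example file name
          # "host" (the default): one connection per target host, so downloads
          # parallelize across domains, as seen above.
          # "access": one connection per URI type (http, ftp, ...) instead.
          Acquire::Queue-Mode "host";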

          • #6
            Hm, never noticed that. I believe parallel download even from the same domain makes sense, because you could be hitting different physical servers.
            But what are we talking about here, downloading over BitTorrent is the future. Right?

            • #7
              Yeah, I hardly ever notice apt doing that, since it's usually from the same domain... And I agree with you: using parallel downloads even for the same domain might be beneficial.

              • #8
                APT is still very limited in its parallelism. They only just recently added parallel downloads, but there is room for more improvements.

                - Parallel downloads are a first step, but don't help that much when the total bandwidth is limited.

                - Parallel decompression and unpacking is a future possibility, but will also be limited by the bandwidth of the file system. Decompression runs pretty fast already, and decompressing multiple files in parallel might lead to increased file fragmentation while only bringing a small performance gain.

                - Parallel setup scripts and copy/mv/install processes are another future possibility, but these need to respect the package dependencies - one cannot install a package before its dependencies have been installed. Still, there is a bit of room to explore here (i.e. parallel install of packages that have no known dependencies on one another, like libraries, includes, documentation, fonts, etc.; see the sketch below).

                In short, there is room for it, but it is definitely not as simple as forking N processes for N packages and hoping they don't overwrite one another or trash system and user configuration files.
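
                Just to illustrate what that safe subset could look like (purely hypothetical - this is NOT how apt/dpkg behave today): unpack packages that are known to be independent in parallel, but keep the maintainer scripts strictly sequential.

                Code:
                # Hypothetical sketch; independent.txt is a made-up list of .deb
                # files with no dependencies on one another.
                $ xargs -a independent.txt -P4 -I{} dpkg-deb -x {} staging/{}
                # The maintainer scripts (preinst/postinst) would still have to
                # run one at a time, in dependency order, since nothing
                # guarantees they are safe to run concurrently.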

                I have not yet seen any OS that could do all this, but I also haven't seen that many. Nobody likes to mess with this sort of thing when a single bug could end up wrecking your entire OS installation, which is why it's being done conservatively.

                • #9
                  Originally posted by Passso View Post
                  Does any OS have multiple installs at a time? It does not seem to be that easy...
                  Gentoo does.
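
                  For reference, Portage really can merge several packages in parallel when the dependency graph allows it; the --jobs option is real, the numbers below are just an example:

                  Code:
                  # Build and install up to 4 packages side by side, where dependencies permit.
                  $ emerge --jobs=4 --load-average=4 @world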

                  • #10
                    Originally posted by fuzz View Post
                    Gentoo does.
                    I suppose this depends on your definition of "multiple installs at a time". Are we talking about:

                    1.) Having multiple versions of a piece of software installed at the same time. For example, having the same library installed as two different versions.
                    2.) Downloading, unpacking and installing multiple packages in a single transaction, where the installation process itself runs sequentially but still handles multiple packages, including conflict resolution, dependency resolution and removal of packages.
                    3.) Doing it across multiple hosts, sandboxes and containers at the same time, either sequentially or in parallel.
                    4.) Downloading, unpacking and installing multiple packages concurrently on multiple CPUs/cores, possibly even lockless and carefree, for maximum performance.
                    5.) The one that got away...
                    6.) Or any combination of the above.

                    Most Linux distributions can do 1.) and 2.) and offer tools for 3.). I doubt any distro will be able to do 4.), or cares for such explicit parallelism.
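
                    As a concrete case of 1.) on Debian (the package name is just a common example): different sonames ship as separate packages, so two versions of the same library can be co-installed.

                    Code:
                    $ dpkg -l 'libstdc++*'   # often lists more than one co-installed version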
