Preview: Ubuntu's Performance Over The Past Two Years


  • #21
    Originally posted by TheOne View Post
    With newer versions of apt-get, a new command, 'source', was introduced; it downloads the original sources used to generate the .deb files on the repository servers. If you as a user have that much time to spend building software with custom CFLAGS and so on, you can simply:

    apt-get source packagename
    cd packagename-vblah

    and recompile as many packages as you want.

    Also, some users insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane. You can always:

    apt-get install build-essential                 # compiler, make and libc headers (run as root or with sudo)
    wget https://example.org/sources.tar.gz         # placeholder URL for the upstream tarball
    tar -xvzf sources.tar.gz
    cd sources
    ./configure --prefix=/home/myuser/mysoftware    # install into your home directory
    make && make install

    But it is pretty easy to just apt-get source a package, modify it as needed, and recompile. Too much science for the gurus out there willing to waste some time?

    Anyway, I use Xubuntu on the desktop and Debian on servers so I can focus on the real work. It doesn't matter if my system is 500 milliseconds slower.
    Or you could just use apt-build.
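    apt-build automates exactly that fetch-and-rebuild cycle with your own compiler flags. A minimal sketch (the package name is illustrative; check apt-build's own documentation for the full option set):

    sudo apt-get install apt-build    # one-time setup; asks for your optimization level
    sudo apt-build install htop       # fetch the source, build with your flags, install the .deb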

    Originally posted by JS987 View Post
    If you install software with make install, you will break your machine sooner or later.
    Meh, that's mostly a myth; I did it for a long time without a single breakage.



    • #22
      Originally posted by mrugiero View Post
      Meh, that's mostly a myth; I did it for a long time without a single breakage.
      You shouldn't break your system if you always install into a fresh directory, e.g. software compiled with ./configure --prefix=/usr/local/software-123.
      You can easily break your system if you try to replace software that was installed from a .deb package with a newer version compiled with ./configure --prefix=/
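      A minimal sketch of that safe pattern (package name and version are illustrative):

      ./configure --prefix=/usr/local/software-1.2.3   # isolated, versioned prefix
      make
      make install                                     # touches nothing outside the prefix
      export PATH=/usr/local/software-1.2.3/bin:$PATH  # opt in explicitly; removal is just rm -rf of the prefix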



      • #23
        Originally posted by JS987 View Post
        You shouldn't break your system if you always install into a fresh directory, e.g. software compiled with ./configure --prefix=/usr/local/software-123.
        You can easily break your system if you try to replace software that was installed from a .deb package with a newer version compiled with ./configure --prefix=/
        I'm aware, but it's really simple to avoid those problems. make install surely won't break your system if you do things right, and doing things right is still considerably easier than packaging...



        • #24
          Originally posted by mrugiero View Post
          I'm aware, but it's really simple to avoid those problems. make install surely won't break your system if you do things right, and doing things right is still considerably easier than packaging...
          I was talking about upgrading/replacing software that is already installed from a .deb package. That isn't safe even if you create your own package; make install is dangerous in that case.
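          One way to spot that conflict before it happens, assuming the binary is on your PATH (the program name is illustrative):

          dpkg -S "$(which gimp)"   # prints the owning package if dpkg installed this file;
                                    # overwriting it with a bare make install desyncs dpkg from the filesystem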



          • #25
            Originally posted by frign View Post
            .....then even Arch unfortunately sets limits.
            Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well. God I love Linux.



            • #26
              Careful with that

              Originally posted by nightmarex View Post
              Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well. God I love Linux.
              Yes, no one stops me from recompiling software manually with just the features I want and then packaging it in my own repo.
              Theoretically, it's possible and works in most cases.

              Gentoo, however, does that for me automatically and allows me to set _global_ USE-flags, which is a tremendous simplification!

              If I wanted to strip PAM from all packages, I would first have to identify every package depending on it (to be fair, that's still quite easy on Arch and Debian).
              Then I would have to get source tarballs for the respective programs and compile them without PAM. Each one would have to be packaged and explicitly declared independent of the PAM libs to prevent aptitude from pulling the library back in accidentally.
              So there I go: if everything works, I can install the new packages and remove the PAM components manually.
              In Gentoo, I just need to add "-pam" to $USE (in /etc/portage/make.conf) and the rest is done automatically.
              Removing PAM is just a trivial example. What about libpng? What about removing all traces of ConsoleKit?
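              Concretely, that workflow amounts to this (a sketch; the rebuild invocation follows Portage convention, and the @world set syntax may differ slightly on older systems):

              # /etc/portage/make.conf
              USE="-pam"    # globally drop PAM from every package that offers the flag

              # rebuild everything affected by the changed USE flag
              emerge --ask --newuse --deep @world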

              To be fair, we are talking about a binary distribution, and you have your freedoms by being able to package your own stuff. But if there is a library that many programs pull in as an unnecessary dependency (which can be a security problem and is certainly a performance one), then you can't get around a source-based distribution like Gentoo, as repackaging is a waste of time when it involves many packages.

              I like to put it this way: on Gentoo, the easy stuff is complex, but the more complex the task gets, the easier it becomes. Overall, you don't have to worry about the easy stuff once you've taken care of it.

              Overall, I love GNU/Linux for being that flexible. This would never be possible with Windows or Mac OS.
              Last edited by frign; 13 July 2013, 12:57 PM.



              • #27
                QFT

                Originally posted by BO$$ View Post
                Not to mention unnecessary.
                Originally posted by BO$$
                Especially since it's unnecessary.
                qft.

                I don't think you are capable of perceiving the potential of these things. (As I already told you).



                • #28
                  Originally posted by frign View Post
                  And now compare it with Gentoo ...

                  Ubuntu may have become faster, but it is still horrible bloatware.
                  I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary-based distros such as Debian, Ubuntu, Fedora...

                  Do you know how much performance you gain by going from generic AMD64 packages to compiling with optimizations? Gentoo's own site recommends the distro for owners of 8-core machines but doesn't give any detail on why.
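                  For context, the optimizations in question are the global compiler flags in make.conf, and the many-cores advice is mostly about parallel build times rather than runtime speed. Typical, illustrative handbook-style values:

                  # /etc/portage/make.conf
                  CFLAGS="-march=native -O2 -pipe"   # tune generated code for the local CPU
                  CXXFLAGS="${CFLAGS}"
                  MAKEOPTS="-j8"                     # parallel compile jobs; this is where 8 cores pay off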



                  • #29
                    Originally posted by BO$$ View Post
                    Yes I am. It's just masturbation. There is no need to compile locally, just to show off maybe.
                    It will likely contain less malware. No additional patches from the NSA.



                    • #30
                      Go for it!

                      Originally posted by juanrga View Post
                      I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary-based distros such as Debian, Ubuntu, Fedora...

                      Do you know how much performance you gain by going from generic AMD64 packages to compiling with optimizations? Gentoo's own site recommends the distro for owners of 8-core machines but doesn't give any detail on why.
                      You should really go for it! I started with Gentoo in December on a quad-core Mac mini, which presents eight logical cores with hyperthreading enabled. But you could run Gentoo even on a single-core machine, as compiling doesn't need to be supervised, and for very large programs binary versions are available in case of problems.

                      Installing it for the first time is a _real_ challenge, and I don't want to talk that down in any way: my first installation took over 8 hours and the additional setup took days, but it was definitely worth it.
                      You can't compare it to normal GNU/Linux distributions: my system boots in 3 seconds and you can literally tweak anything.

                      It's not about extreme compiler flags (optimizations and the like), but more about what you compile into your software (shared libraries; generally speaking, dependencies).
                      If you use a binary distribution and install GIMP, for instance, it pulls everything in: support for udev, image libraries, ACLs, and so on.
                      You don't need most of it, and compiling your own version of a given program can definitely yield positive results in memory consumption, performance, and startup speed.
                      Add to this a tremendous package manager (Portage), great documentation, and an awesome community.
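                      Trimming a single package like that is a one-line, per-package USE entry. A sketch (the flags shown are illustrative; equery uses gimp, from gentoolkit, lists the real ones):

                      # /etc/portage/package.use
                      media-gfx/gimp -postscript -alsa    # drop features you never use (flags illustrative)

                      # preview the effect before rebuilding
                      emerge --pretend --verbose gimp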

                      I reinstalled Gentoo a few months ago (don't get me wrong, one setup can literally be used for decades) and knew a lot about the system by then. I was finished pretty quickly, as I could easily move all the relevant configuration files over to the new setup.

                      All in all, the steep learning curve is worth it. Go for it!

