Preview: Ubuntu's Performance Over The Past Two Years


  • #16
    Originally posted by Calinou View Post
    To be honest, +1. I use Xubuntu myself, which tends to be the best compromise (no distro being perfect).

    It's not worth spending hours making your system from nothing (have fun reinstalling), and even less compiling.
    That's why I grab a copy of Ubuntu, strip out all the spyware/crapware, and apply all the mods and tweaks from my afterinstall.txt. So far that seems the easiest way. LTS versions only, always.
    After switching machines and then nuking my install a second time, I decided that having a file with all the mods/tweaks is a much quicker way to go.

    These graphs make Ubuntu look like it's all over the place. When will it settle?



    • #17
      Originally posted by JS987 View Post
      create own deb package
      Why would you do that? Deb packages are practical for distribution, but if you are already building locally, just do make install and don't bother packaging. It's a different story if you have several machines you want to update.
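
      A side note, since packaging came up: if you want dpkg to at least know about a local build, checkinstall wraps make install into a quick-and-dirty deb. A rough sketch (the package name is up to you):

      sudo apt-get install checkinstall
      ./configure && make
      sudo checkinstall    # runs make install and records the installed files as a .deb

      That way the package manager can cleanly remove it later.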

      Originally posted by Calinou View Post
      To be honest, +1. I use Xubuntu myself, which tends to be the best compromise (no distro being perfect).

      It's not worth spending hours making your system from nothing (have fun reinstalling), and even less compiling.
      You both say this like Ubuntu was the first binary distro ever...

      Originally posted by frign View Post
      Sorry, I might not have been clear enough. I was talking about USE-flags, and that is only achievable by recompiling the software and actually removing the dynlib dependency from the binary itself. So I agree with your points. Arch is a _fine distribution_ and I like how they handle meta-packages, but if you want to go the whole way (like I do with my Gentoo system) and iron out everything you don't need, then even Arch unfortunately sets limits.

      With Gentoo, I have fun tinkering with the kernel and stripping unneeded USE-flags. It turns out that the less you pull in and use, the fewer ways there are to break your system, and it is effectively more reliable and snappier.
      Of course, you have to edit config files and the like, but once you've set everything up in the first place, you can use your system for _years_ without problems.

      I want to make clear, though, that Gentoo is of course not for everyone. I don't know why, but we have somehow become used to reinstalling our operating systems every year or even more often. Setting up a "normal" distro is easy and fast, but you have to live with problems that might occur while updating, changing a setting or even installing new software.
      Knowing what you've set up manually in the first place helps you understand the system and, in the long run, lets you fix problems easily, saving time.

      There is no perfect distribution. If you want to be flexible, you have to invest time and read manuals. If you want to set things up quickly, Arch is one good way to go.
      I know what you mean, but since I usually just do things the hard way (learning by breaking), I didn't know that terminology; I always called USE-flags build-time options. I used to build a custom kernel on a VIA machine to test James Simmons' KMS driver, and since I was there I did the same as you. I also used to build the desktop, just for fun, but I stopped after the last distro upgrade (I use Xubuntu); I'm not really sure why now.
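
      I used to pass those switches by hand when building; something like this is the manual equivalent of a USE flag (option names vary per project, so check ./configure --help first; the prefix is just an example):

      ./configure --help | grep -i pam    # find the project's switch for the feature
      ./configure --prefix=$HOME/sw --without-pam
      make
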
      Last edited by mrugiero; 07-12-2013, 09:33 PM.



      • #18
        apt-get source?

        Newer versions of apt-get introduced a 'source' command, which downloads the original sources used to generate the deb files on the repository servers. If you as a user have time to spare for building software with custom CFLAGS, etc., then you can simply

        apt-get source packagename
        cd packagename-vblah

        and recompile as many packages as you want.

        Also, some users insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane; you can always

        apt-get install build-essential
        wget sources.tar.gz
        tar -xvzf sources.tar.gz
        cd sources
        ./configure --prefix=/home/myuser/mysoftware
        make && make install

        But it is pretty easy to just apt-get source a package, modify it as needed and compile. Too much science for the gurus out there willing to spend some time?
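
        And if you want the result as a proper deb instead of a bare make install, the stock tooling covers that too. A sketch, assuming deb-src lines are enabled in /etc/apt/sources.list:

        sudo apt-get build-dep packagename    # install the build dependencies
        apt-get source packagename
        cd packagename-vblah
        DEB_CFLAGS_APPEND="-march=native" dpkg-buildpackage -us -uc -b
        sudo dpkg -i ../packagename_*.deb

        (DEB_CFLAGS_APPEND only takes effect for packages whose rules use dpkg-buildflags, which many do these days.)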

        Anyway, I use Xubuntu for the desktop and Debian for servers so I can focus on the real work. It doesn't matter if my system is 500 milliseconds slower.
        Last edited by TheOne; 07-13-2013, 01:27 AM.



        • #19
          Originally posted by mrugiero View Post
          Why would you do that? Deb packages are practical for distribution, but if you are already building locally, just do make install and don't bother packaging. It's a different story if you have several machines you want to update.
          If you install software with make install, you will break your machine sooner or later.



          • #20
            Originally posted by TheOne View Post
            Newer versions of apt-get introduced a 'source' command, which downloads the original sources used to generate the deb files on the repository servers. If you as a user have time to spare for building software with custom CFLAGS, etc., then you can simply

            apt-get source packagename
            cd packagename-vblah

            and recompile as many packages as you want.

            Also, some users insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane; you can always

            apt-get install build-essential
            wget sources.tar.gz
            tar -xvzf sources.tar.gz
            cd sources
            ./configure --prefix=/home/myuser/mysoftware
            make && make install

            But it is pretty easy to just apt-get source a package, modify it as needed and compile. Too much science for the gurus out there willing to spend some time?

            Anyway, I use Xubuntu for the desktop and Debian for servers so I can focus on the real work. It doesn't matter if my system is 500 milliseconds slower.
            You shouldn't need to create your own package. On rolling distros, up-to-date packages are usually already available; you install them with one command, which is usually automated.



            • #21
              Originally posted by TheOne View Post
              Newer versions of apt-get introduced a 'source' command, which downloads the original sources used to generate the deb files on the repository servers. If you as a user have time to spare for building software with custom CFLAGS, etc., then you can simply

              apt-get source packagename
              cd packagename-vblah

              and recompile as many packages as you want.

              Also, some users insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane; you can always

              apt-get install build-essential
              wget sources.tar.gz
              tar -xvzf sources.tar.gz
              cd sources
              ./configure --prefix=/home/myuser/mysoftware
              make && make install

              But it is pretty easy to just apt-get source a package, modify it as needed and compile. Too much science for the gurus out there willing to spend some time?

              Anyway, I use Xubuntu for the desktop and Debian for servers so I can focus on the real work. It doesn't matter if my system is 500 milliseconds slower.
              Or you could just use apt-build.
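
              Roughly like this, if memory serves (apt-build asks for an optimization level when it is installed, then automates the fetch-compile-install cycle):

              sudo apt-get install apt-build      # prompts for architecture/optimization level
              sudo apt-build update
              sudo apt-build install packagename  # fetch the source, compile, install the resulting deb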

              Originally posted by JS987 View Post
              If you install software with make install, you will break your machine sooner or later.
              Meh, that's mostly a myth; I did it for a long time without a single breakage.



              • #22
                Originally posted by mrugiero View Post
                Meh, that's mostly a myth; I did it for a long time without a single breakage.
                You shouldn't break your system if you always install into a fresh directory, i.e. software compiled with ./configure --prefix=/usr/local/software-123.
                You can easily break your system if you try to replace software that was installed from a deb package with a newer version compiled with ./configure --prefix=/.
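
                In other words, something like this stays out of dpkg's way entirely (the version number is just a placeholder):

                ./configure --prefix=/usr/local/software-123
                make
                sudo make install
                # run it via an explicit PATH entry instead of overwriting /usr/bin
                export PATH=/usr/local/software-123/bin:$PATH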



                • #23
                  Originally posted by JS987 View Post
                  You shouldn't break your system if you always install into a fresh directory, i.e. software compiled with ./configure --prefix=/usr/local/software-123.
                  You can easily break your system if you try to replace software that was installed from a deb package with a newer version compiled with ./configure --prefix=/.
                  I'm aware, but it's really simple to avoid those problems. make install will surely not break your system if you do things right, and doing things right is still considerably easier than packaging...



                  • #24
                    Originally posted by mrugiero View Post
                    I'm aware, but it's really simple to avoid those problems. make install will surely not break your system if you do things right, and doing things right is still considerably easier than packaging...
                    I was talking about upgrading/replacing software which is already installed from a deb package. It isn't safe even if you create your own package; make install is dangerous in that case.



                    • #25
                      Originally posted by frign View Post
                      .....then even Arch unfortunately sets limits.
                      Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well. God, I love Linux.



                      • #26
                        Careful with that

                        Originally posted by nightmarex View Post
                        Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well. God, I love Linux.
                        Yes, no one stops me from recompiling software manually with just the features I want and then packaging it in my own repo.
                        Theoretically it's possible, and it works in most cases.

                        Gentoo, however, does that for me automatically and allows me to set _global_ USE-flags, which is a tremendous simplification!

                        If I wanted to strip PAM from all packages, I would first have to identify all packages depending on it (to be fair, that's still quite easy on Arch and Debian).
                        Then I would have to get source tarballs for the respective programs and compile them without PAM. Packaging would need to be done, and the packages would have to be explicitly declared independent of the PAM libs to prevent aptitude from pulling the library back in accidentally.
                        So there I go. If everything works, I can install the new packages and remove the PAM components manually.
                        In Gentoo, I just add "-pam" to $USE (in /etc/portage/make.conf) and the rest is done automatically, as sketched below.
                        Removing PAM is just a trivial example. What about libpng? What about removing all traces of ConsoleKit?
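
                        Roughly like this; the rebuild can take a while, depending on how many installed packages use the flag:

                        # in /etc/portage/make.conf
                        USE="-pam"

                        # rebuild whatever is affected by the changed USE flags, then drop orphans
                        emerge --ask --newuse --deep @world
                        emerge --ask --depclean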

                        To be fair, we are talking about a binary distribution, and you do have your freedoms in being able to package your own stuff. But if there is a library many programs pull in as a dependency unnecessarily (which can be a security problem and is a performance problem), then you can't get around a source-based distribution like Gentoo, as repackaging is a waste of time when it involves many packages.

                        I like to put it this way: on Gentoo, the easy stuff is complex, but the more complex the task, the easier it gets. Overall, you don't have to worry about the easy stuff once you've taken care of it.

                        All in all, I love GNU/Linux for being this flexible. This would never be possible with Windows or Mac OS.
                        Last edited by frign; 07-13-2013, 12:57 PM.



                        • #27
                          QFT

                          Originally posted by BO$$ View Post
                          Not to mention unnecessary.
                          Originally posted by BO$$
                          Especially since it's unnecessary.
                          QFT.

                          I don't think you are capable of perceiving the potential of these things (as I already told you).



                          • #28
                            Originally posted by frign View Post
                            And now compare it with Gentoo ...

                            Ubuntu may have become faster, but it is still horrible bloatware.
                            I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary distros such as Debian, Ubuntu, Fedora...

                            Do you know how much performance you gain by going from a generic amd64 build to compiling with optimizations? The Gentoo site itself recommends the distro for owners of 8-core machines, but doesn't give any detail on why.
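
                            For what it's worth, one can at least list what a generic build leaves unused before committing to a recompile; this is not a benchmark, just a feature diff:

                            gcc -march=x86-64 -Q --help=target | grep enabled > generic.txt
                            gcc -march=native -Q --help=target | grep enabled > native.txt
                            diff generic.txt native.txt    # ISA extensions the generic build never uses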



                            • #29
                              Originally posted by BO$$ View Post
                              Yes I am. It's just masturbation. There is no need to compile locally, just to show off maybe.
                              It will likely contain less malware: no additional patches made by the NSA.



                              • #30
                                Go for it!

                                Originally posted by juanrga View Post
                                I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary distros such as Debian, Ubuntu, Fedora...

                                Do you know how much performance you gain by going from a generic amd64 build to compiling with optimizations? The Gentoo site itself recommends the distro for owners of 8-core machines, but doesn't give any detail on why.
                                You should really go for it! I started with Gentoo in December on a quad-core Mac mini, which effectively presents eight logical cores with Hyper-Threading enabled. But you could normally use Gentoo even on a single-core machine, as compiling doesn't need to be supervised, and for very large programs binary versions are available in case of problems.

                                Installing it for the first time is a _real_ challenge. I don't want to talk it down in any way: my first installation took over 8 hours and the additional setup took days, but it was definitely worth it.
                                You can't compare it to normal GNU/Linux distributions: my system boots in 3 seconds, and you can literally tweak anything.

                                It's not about extreme compiler flags (optimizations and the like), but more about what you compile into your software (shared libraries; generally speaking, dependencies).
                                If you use a binary distribution and install GIMP, for instance, it pulls everything in: support for udev, image libraries, ACLs and the like.
                                You don't need most of it, and compiling your own version of a given program can definitely yield positive results in terms of memory consumption, performance and startup speed; see the sketch below.
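
                                For a single package it is only a couple of lines (the flags here are illustrative; the available list depends on the package):

                                # in /etc/portage/package.use
                                media-gfx/gimp -aalib -alsa

                                emerge --pretend --verbose media-gfx/gimp   # preview which flags would be built in
                                emerge --ask media-gfx/gimp
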
                                On top of that you get a tremendous package manager (Portage), great documentation and an awesome community.

                                I reinstalled Gentoo a few months ago (don't get me wrong, one setup can literally be used for decades) and knew a lot about the system by then. I was finished pretty quickly, as I could easily move all the relevant configuration files to the new setup.

                                All in all, the steep learning-curve is worth it. Go for it!

