Preview: Ubuntu's Performance Over The Past Two Years

  • #11
    Originally posted by BO$$ View Post
    Outdated software on Ubuntu? Then Red Hat Enterprise Linux must be for idiots. Or maybe people like stability a little more than using nightly builds, hmmm?
    If a new stable version of some software is released after a new Ubuntu release, you can either wait up to six months or create your own deb package, which is hard or impossible because of dependencies.
    Red Hat is used by companies that don't mind outdated software. Many of them still use Windows XP.



    • #12
      Arch is fine!

      Originally posted by mrugiero View Post
      If the point is deciding which software to install, I think it's simpler to use Arch. Unless you actually mean which features your software builds with; in that case, the easier it is to build your own, the better, and I agree you should go with Gentoo.
      Sorry, I might not have been clear enough. I was talking about USE flags, and that is only achievable by recompiling the software and actually removing the dynlib dependency from the binary itself. So I agree with your points. Arch is a _fine distribution_ and I like how they handle meta-packages, but if you want to go all the way (like I do with my Gentoo system) and iron out everything you don't need, then even Arch unfortunately sets limits.

      With Gentoo, I have fun tinkering with the kernel and stripping unneeded USE flags. It turns out that the less you pull in and use, the fewer ways there are to break your system, and it is effectively more reliable and snappier.
      Of course, you have to edit config files and the like, but once you've set everything up in the first place, you can use your system for _years_ without problems.

      I want to make clear, though, that Gentoo is of course not for everyone. I don't know why, but we have somehow gotten used to reinstalling our operating systems every year or so. Setting up a "normal" distro is easy and fast, but you have to live with problems that may occur while updating, changing a setting, or even installing new software.
      Knowing what you've set up manually in the first place helps you understand the system and, in the long run, allows you to fix problems easily, saving time.

      There is no perfect distribution. If you want to be flexible, you have to invest time and read manuals. If you want to set things up quickly, Arch is one good way to go.
      Last edited by frign; 07-12-2013, 01:06 PM.



      • #13
        Originally posted by BO$$ View Post
        Well, 13.04 uses 3.8, so if they backported it, then yay. Go Canonical! They understand what is important for the user and execute fast and efficiently! That's why they are number one.
        It's not even enabled by default in 3.11 because it's still untested. There's no way Canonical backported it and enabled it by default. Sounds like the placebo effect at work.



        • #14
          Originally posted by smitty3268 View Post
          It's not even enabled by default in 3.11 because it's still untested. There's no way Canonical backported it and enabled it by default. Sounds like the placebo effect at work.
          Maybe he smoked some of Shuttleworth's chest hair.



          • #15
            Originally posted by BO$$ View Post
            Nobody with a life uses it and you know it. Nobody spends time compiling their own shit. Canonical got this. Now it's time for you to get it. And leave the basement. We want things to work out of the box without doing anything.
            To be honest, +1. I use Xubuntu myself, which tends to be the best compromise (no distro being perfect).

            It's not worth spending hours building your system from nothing (have fun reinstalling), and even less so compiling everything.



            • #16
              Originally posted by Calinou View Post
              To be honest, +1. I use Xubuntu myself, which tends to be the best compromise (no distro being perfect).

              It's not worth spending hours building your system from nothing (have fun reinstalling), and even less so compiling everything.
              That's why I grab a copy of Ubuntu, strip out all the spyware/crapware, and apply all the mods and tweaks from my afterinstall.txt. So far it seems the easiest way. LTS versions only, always.
              After switching machines and then nuking my install a second time, I decided that keeping a file with all the mods/tweaks is a much quicker way to go.

              These graphs make Ubuntu look like it's all over the place. When will it settle?



              • #17
                Originally posted by JS987 View Post
                create your own deb package
                Why would you do that? Deb packages are practical for distribution, but if you are already building locally, just run make install and don't bother with packaging. It's a different matter if you have several machines you want to keep updated.

                Originally posted by Calinou View Post
                To be honest, +1. I use Xubuntu myself, which tends to be the best compromise (no distro being perfect).

                It's not worth spending hours building your system from nothing (have fun reinstalling), and even less so compiling everything.
                You both say this as if Ubuntu were the first binary distro ever...

                Originally posted by frign View Post
                Sorry, I might not have been clear enough. I was talking about USE flags, and that is only achievable by recompiling the software and actually removing the dynlib dependency from the binary itself. So I agree with your points. Arch is a _fine distribution_ and I like how they handle meta-packages, but if you want to go all the way (like I do with my Gentoo system) and iron out everything you don't need, then even Arch unfortunately sets limits.

                With Gentoo, I have fun tinkering with the kernel and stripping unneeded USE flags. It turns out that the less you pull in and use, the fewer ways there are to break your system, and it is effectively more reliable and snappier.
                Of course, you have to edit config files and the like, but once you've set everything up in the first place, you can use your system for _years_ without problems.

                I want to make clear, though, that Gentoo is of course not for everyone. I don't know why, but we have somehow gotten used to reinstalling our operating systems every year or so. Setting up a "normal" distro is easy and fast, but you have to live with problems that may occur while updating, changing a setting, or even installing new software.
                Knowing what you've set up manually in the first place helps you understand the system and, in the long run, allows you to fix problems easily, saving time.

                There is no perfect distribution. If you want to be flexible, you have to invest time and read manuals. If you want to set things up quickly, Arch is one good way to go.
                I know what you mean, but since I usually just do things the hard way (learning by breaking), I didn't know that terminology; I always called USE flags build-time options. I used to build a custom kernel on a VIA machine to test James Simmons' KMS driver, and since I was there I did the same as you. I also used to build the desktop, just for fun, but I stopped after the last distro upgrade (I use Xubuntu); I'm not really sure why now.
                Last edited by mrugiero; 07-12-2013, 09:33 PM.



                • #18
                  apt-get source?

                  With newer versions of apt-get, a new 'source' command was introduced which downloads the original sources used to generate the deb files on the repository servers. If you as a user have that much time to spend building software with custom CFLAGS and so on, you can simply:

                  apt-get source packagename
                  cd packagename-vblah

                  and recompile as many packages as you want.

                  Some users also insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane; you can always:

                  apt-get install build-essential
                  wget sources.tar.gz
                  tar -xvzf sources.tar.gz
                  cd sources
                  ./configure --prefix=/home/myuser/mysoftware
                  make && make install

                  But it is pretty easy to just apt-get source a package, modify it as needed, and compile. Too much science for the gurus out there willing to spend some time?
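                  To see the home-prefix trick end to end without fetching a real tarball, here is a tiny self-contained stand-in (the project name "hello" and the paths are made up for illustration; the single cc call stands in for ./configure && make):

                  ```shell
                  # Toy stand-in for the tarball flow above (no network needed).
                  set -e
                  SRC="$(mktemp -d)"                # pretend this is the unpacked tarball
                  PREFIX="$HOME/mysoftware"         # same idea as --prefix above
                  printf '%s\n' '#include <stdio.h>' \
                    'int main(void){ puts("hello from my prefix"); return 0; }' > "$SRC/hello.c"
                  mkdir -p "$PREFIX/bin"
                  cc "$SRC/hello.c" -o "$PREFIX/bin/hello"   # stands in for ./configure && make install
                  "$PREFIX/bin/hello"                        # binary lives under $HOME, not /usr
                  rm -rf "$SRC"
                  ```

                  Everything lands under $HOME/mysoftware, so removing the program later is just deleting that directory.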

                  Anyway, I use Xubuntu for desktops and Debian for servers so I can focus on the real work. It doesn't matter if my system is 500 milliseconds slower.
                  Last edited by TheOne; 07-13-2013, 01:27 AM.



                  • #19
                    Originally posted by mrugiero View Post
                    Why would you do that? Deb packages are practical for distribution, but if you are already building locally, just run make install and don't bother with packaging. It's a different matter if you have several machines you want to keep updated.
                    If you install software with a bare make install, you will break your machine sooner or later.
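                    A hedged middle ground between the two positions (illustrative names and paths, not from the thread): give each hand-installed program its own directory under $HOME, so nothing the package manager owns is ever touched and uninstalling is just deleting a directory.

                    ```shell
                    # Sketch: isolate a hand-installed program in a per-package prefix.
                    set -e
                    PREFIX="$HOME/.local/opt/demo-1.0"       # one directory per package/version
                    mkdir -p "$PREFIX/bin"
                    # Stand-in for `make install` dropping a binary into the prefix:
                    printf '#!/bin/sh\necho demo ok\n' > "$PREFIX/bin/demo"
                    chmod +x "$PREFIX/bin/demo"
                    "$PREFIX/bin/demo"                       # nothing under /usr was modified
                    # Uninstall = rm -rf "$PREFIX"; dpkg's files stay untouched either way.
                    ```

                    Tools like checkinstall (which wraps make install into a removable .deb) or GNU Stow (which symlinks per-package prefixes into /usr/local) automate the same idea.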



                    • #20
                      Originally posted by TheOne View Post
                      With newer versions of apt-get, a new 'source' command was introduced which downloads the original sources used to generate the deb files on the repository servers. If you as a user have that much time to spend building software with custom CFLAGS and so on, you can simply:

                      apt-get source packagename
                      cd packagename-vblah

                      and recompile as many packages as you want.

                      Some users also insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane; you can always:

                      apt-get install build-essential
                      wget sources.tar.gz
                      tar -xvzf sources.tar.gz
                      cd sources
                      ./configure --prefix=/home/myuser/mysoftware
                      make && make install

                      But it is pretty easy to just apt-get source a package, modify it as needed, and compile. Too much science for the gurus out there willing to spend some time?

                      Anyway, I use Xubuntu for desktops and Debian for servers so I can focus on the real work. It doesn't matter if my system is 500 milliseconds slower.
                      You shouldn't need to create your own package. On rolling distros, up-to-date packages are usually already available; you install them with one command, which is usually automated.
