Red Hat Enterprise Linux 6.5 Preps New Capabilities


  • #31
    Originally posted by Honton View Post
    You assert Debian wants to keep the old default. That is wrong.
    Really? So how about giving us some links that prove your claim? I receive the Debian newsletters, and in none of them was a change to systemd announced, so maybe I am missing something or you are just making things up again.

    Comment


    • #32
      Originally posted by Vim_User View Post
      Really? So how about giving us some links that prove your claim? I receive the Debian newsletters, and in none of them was a change to systemd announced, so maybe I am missing something or you are just making things up again.
      Debian hasn't switched yet because they move relatively slowly and releases are infrequent, but it appears that they will, considering that, according to this survey, the majority seems either in favor of systemd or indifferent to which init system is used: http://people.debian.org/~stapelberg...y-results.html http://people.debian.org/~stapelberg...-portable.html

      Comment


      • #33
        Originally posted by RahulSundaram View Post
        Debian hasn't switched yet because they move relatively slowly and releases are infrequent, but it appears that they will, considering that, according to this survey, the majority seems either in favor of systemd or indifferent to which init system is used: http://people.debian.org/~stapelberg...y-results.html http://people.debian.org/~stapelberg...-portable.html
        Only 43.9% are in favor of systemd, while 32.2% don't want systemd as default. 23.7% say that they don't know yet; that doesn't mean that they don't care, but that they have to look into it first. I doubt that in the end more than 50-55% will be in favor of systemd as default, but of course I have no numbers on that.
        Anyway, saying that Debian already wants the switch is plain wrong and another one of Honton's tactics (read: lies) to promote his view.

        Comment


        • #34
          Originally posted by RahulSundaram View Post
          Debian hasn't switched yet because they move relatively slowly and releases are infrequent, but it appears that they will, considering that, according to this survey, the majority seems either in favor of systemd or indifferent to which init system is used: http://people.debian.org/~stapelberg...y-results.html http://people.debian.org/~stapelberg...-portable.html
          Ironically, it was the results of that same survey that caused me to form the opinion that Debian will not be switching to systemd any time soon. Debian very rarely resorts to "majority rules" decision making. Debian places great emphasis on consensus. I think 32% is entirely too high. That means that nearly a third of the survey respondents are explicitly against the direction that you think Debian is heading in.

          Comment


          • #35
            Originally posted by phred14 View Post
            Pardon me, I don't disagree with what you say, just with the way it seems to be getting commonly implemented. So please let me restate that, ever so slightly modified...

            "We need a solution to configuration and management that doesn't include bash scripts or, in general, opening a cli, but does not forbid using bash scripts or the cli, either."

            Back in the day, I've had to do both in order to get my systems running properly, or even booting at all. I certainly appreciate all of the auto-configuring gizmos that make my system easier to run and boot, UNTIL they fail to work correctly. Then I just want to do whatever it takes to get running again, and I don't want those same gizmos sitting in the way, obfuscating the old job I used to be able to readily figure out how to do, or even working against me, undoing my every tweak or fix. Please leave Linux hackable - it seems like there are those trying to turn Linux into Windows. It doesn't have to be that way - we should be able to have both.
            I agree with you 100% on this. The general trend is consistently making it harder to use your favorite text editors or tools like sed and awk to administer your system, and I think that's a bad approach from a usability perspective. The new tools lower the learning curve for new sysadmins, which is a good thing, but this should not come at the cost of making the traditional utilities obsolete, as those utilities allow a sysadmin to be more productive once he or she has learned them.
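            To make the point concrete, here is a minimal sketch of the kind of plain-text administration being defended above. The config file name and its settings are invented purely for illustration; the point is only that sed and awk cover this workflow with no special tooling (assumes GNU sed for `-i`):

```shell
# Hypothetical example: the kind of plain-text administration the post is
# defending. The file name and settings are invented for illustration.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
# demo daemon config
UseCompression no
MaxClients 10
EOF

# Flip one setting in place with sed - no special tooling required.
sed -i 's/^UseCompression no$/UseCompression yes/' "$cfg"

# awk is just as handy for quick audits, e.g. listing the active settings.
result=$(awk '!/^#/ && NF { print $1 "=" $2 }' "$cfg")
echo "$result"

rm -f "$cfg"
```

            The same edit done through a GUI tool works too, of course; the complaint is only about tools that rewrite or undo such hand edits.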

            Comment


            • #36
              Originally posted by phred14 View Post
              Pardon me, I don't disagree with what you say, just with the way it seems to be getting commonly implemented. So please let me restate that, ever so slightly modified...

              "We need a solution to configuration and management that doesn't include bash scripts or, in general, opening a cli, but does not forbid using bash scripts or the cli, either."

              Back in the day, I've had to do both in order to get my systems running properly, or even booting at all. I certainly appreciate all of the auto-configuring gizmos that make my system easier to run and boot, UNTIL they fail to work correctly. Then I just want to do whatever it takes to get running again, and I don't want those same gizmos sitting in the way, obfuscating the old job I used to be able to readily figure out how to do, or even working against me, undoing my every tweak or fix. Please leave Linux hackable - it seems like there are those trying to turn Linux into Windows. It doesn't have to be that way - we should be able to have both.
              Whoever said anything about turning Linux into a non-hackable OS? I love it the way it is; it just needs a top-notch, non-X-based UI. Everything else should remain as it is. I wasn't trying to revise history either - maybe I wasn't quite as specific as Serge was... My point about Linux needing a fast GUI, detached from the X legacy, still stands. Hackability isn't mutually exclusive with a GUI.

              Comment


              • #37
                Originally posted by Serge View Post
                I do believe that marketing would do far more for increasing the usage share of Linux-based and *BSD OS'es than any other one thing, but the point I was trying to get across is that we need many things, not just any one thing. I disagree with you on specifics, however.
                More than any one thing? Maybe. It really depends on what constitutes "one thing". If Adobe ported all of their desktop apps to Linux, that would be HUGE. If OOTB Linux could handle both low latency and throughput as well as OS X, that would be a huge feature (the fact that we have even lower latency than OS X without massive throughput loss for the vast majority of workloads is already a big deal, but I don't know of an entity offering that setup for the desktop that also has support). Honestly, I keep going back to latency because that tells you much about the reliability of your underlying system.
                We also need a really good DE. I'm not aware of one that is at once fast, stable, and "full featured" (by a Windows user's definition of that).


                "We need standards."
                (Sorry if I am strawmanning you on this one; I wasn't 100% sure what you meant by "we need standards", so I took a guess.)
                Heh, I'm not sure either. What's clear to me, however, is that there is massive duplication of effort (yay, that old standard). Packaging format/tooling isn't a huge deal because, well, deb and rpm are structurally very similar and basically work alike. A much bigger deal is not being able to rely on standard system-level APIs. Systemd, as ericg says above, goes a long, long way towards fixing this. Since we are talking about Linux, I'm not considering the BSDs (at any rate, I think they serve a different, and shrinking, market, IMHO). LSB, in practice, seems to address mainly naming and placement conventions (very important, obviously), but adopting the actual APIs in the LSB would be a great thing (again, largely unneeded with systemd). As for Debian/Ubuntu, I think Ubuntu spins are going to increasingly diverge. Mir is not helping, but systemd just offers too many advantages, with standardization being a really nice bonus.

                Do you mean we need to increase compatibility across multiple Linux-based OSes ("distributions") by way of standards? Well, we have LSB, and that's been a failure. LSB mandates standards that are in conflict with what distribution developers feel is the best approach. LSB was dead on arrival. Experienced standards bodies have figured out - and developed policies and operating guidelines accordingly - that one does not simply "mandate" standards, period. Something must first become a de facto standard before it can hope to have success as a de jure standard.
                The big "problem" is that, by its nature, Linux draws people who prize their independence over most anything else. The fact that most any distro can be spun to handle most any duty is proof that the changes the other distros felt had to be made just weren't necessary. That's not to say ALL forks aren't worthwhile, but anything beneath the UI frameworks needs to be quite standard (at least API-standard, if not ABI).

                LSB tried to force consensus where consensus had failed to develop naturally. The various distributions did things the way they did because they felt that their approach was the best. For example, Debian developers and developers of distros based on Debian felt that the .deb packaging format had advantages over the .rpm format. If they did not feel this way, they would have switched to .rpm on their own, without being prompted by LSB. And on the other hand, requiring support for BOTH sides of an incompatibility, as was done with Gtk+ and Qt, just leads to bloat.
                The best consensus doesn't always happen naturally. Glasnost wasn't natural. It was the result of LOTS of high-level diplomacy between two parties. The Linux community consists of WAY more than two groups, which would argue for natural consensus, but, at a high level, the deb/rpm split seems to work well as a shorthand. On one side you have Debian, and on the other you have Fedora/SUSE. The big blocker with Debian seems to be their insistence on keeping a BSD kernel as a drop-in when it should, arguably, be a tier-2 project. The entire community shouldn't be held back because of something like that. How many people are using Debian userspace with BSD? Some certainly, but I'd bet it's less than 1%. Why hold back the entire project for those few?
                About deb/rpm: while there are some differences, this has really become a religious issue, AFAICT. Gtk/Qt should be the place where differences in the system APIs can hide, and if that's what everyone used there'd be little problem, but proprietary apps, especially art and CAD apps, can and do use their own toolkits. For them, having a stable system API is a must. So, what they currently have to do is package for RHEL/SLES version X. That could go away. We wouldn't have to have people repackaging (assuming it's even possible) for every damn distro. Packaging is simply the largest user of resources, AFAICT.
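                The claim that deb and rpm "basically work alike" is easy to see in the everyday commands. The following is just an illustrative cheat sheet (the dpkg and rpm options shown are real; the package name "foo" is a placeholder), printed as a table so nothing here requires either tool to actually be installed:

```shell
# Side-by-side cheat sheet of everyday packaging tasks in deb-land and
# rpm-land, printed as a table to show how closely the two toolchains
# mirror each other. Nothing here requires dpkg or rpm to be installed.
print_row() { printf '%-28s %-20s %s\n' "$1" "$2" "$3"; }

table=$(
  print_row "task"                       "deb"              "rpm"
  print_row "list files in a package"    "dpkg -L foo"      "rpm -ql foo"
  print_row "which package owns a file"  "dpkg -S /bin/ls"  "rpm -qf /bin/ls"
  print_row "show package metadata"      "dpkg -s foo"      "rpm -qi foo"
  print_row "install a local package"    "dpkg -i foo.deb"  "rpm -i foo.rpm"
)
echo "$table"
```

                The one-to-one mapping of tasks is the point: the split between the formats is largely historical rather than functional.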


                I think that cross-distribution compatibility is very important, but I do not believe that this problem can be solved very easily. Standards by themselves will always fail to gain acceptance and adoption as long as the underlying causes of the incompatibilities are not addressed. Take, for example, FFmpeg vs Libav. Those are two projects with deeply ingrained animosity towards each other, and growing incompatibility between the two leads end users and downstream developers not directly involved with either project to take sides as well. Backing one with a standard and ignoring the other will result in the standard failing and will not have a strong impact on adoption of one over the other.

                Although the FFmpeg-Libav conflict is caused by engineers, there are other artificial incompatibilities* that are created by managers (hint: http://translate.google.com/#en/ru/peace). Driving more open source projects towards corporate control will not fix this problem. Instead, we'll trade freedom and incompatibilities born of passion for vendor lock-in and incompatibilities born of marketing / sales needs ("market segmentation", "product differentiation").

                *My apologies to any FFmpeg or Libav developers. I've read up on the projects and realize that the FFmpeg and Libav incompatibilities are not entirely artificial.
                Sorry, I haven't been following that rift closely so I can't comment.


                "Alsa needs to be fixed."
                I'm not qualified to write about whether it's ALSA or something else in our audio stack that needs to be fixed, but I don't believe it's all that important for the typical desktop user. I'm not saying that typical users don't care about audio, I'm saying that for typical users our audio is "good enough". My direct experience is that audio in Linux-based OS'es still sucks for professional use, but I think that doesn't impact typical users that much.
                Yes, ALSA has issues. As it's been explained to me, PulseAudio could do low-latency work if ALSA were made to use PLLs for timing. That is basically Core Audio. Having that as part of the standard desktop is a really big deal, even though the number of musicians/engineers is pretty small in absolute terms.
                I'm sure ninez could explain this more thoroughly.


                I think we largely agree; it's just that I don't think marketing is the biggest issue right now. Once we get Wayland (or something like it) working well, the open drivers in better shape (basically working as well as r600 does now, but for NVIDIA/Intel as well), systemd more or less completed (reliable hibernation and low power usage are hugely important), AND one really good DE, THEN I think a marketing campaign would be hugely beneficial. Right now I just don't think we are good enough in the areas that people will really notice.

                Comment


                • #38
                  Originally posted by RahulSundaram View Post
                  Debian hasn't switched yet because they move relatively slowly and releases are infrequent, but it appears that they will, considering that, according to this survey, the majority seems either in favor of systemd or indifferent to which init system is used: http://people.debian.org/~stapelberg...y-results.html http://people.debian.org/~stapelberg...-portable.html
                  You forgot to mention the discussion of moving Debian to OpenRC that occurred; there were quite a few interesting posts there.
                  Most notably, it's compatible with Linux and the BSDs, and it's under a more liberal BSD license.

                  Comment


                  • #39
                    Originally posted by intellivision View Post
                    You forgot to mention the discussion of moving Debian to OpenRC that occurred; there were quite a few interesting posts there. Most notably, it's compatible with Linux and the BSDs, and it's under a more liberal BSD license.
                    Systemd has far more support than any other init system in that survey. The main contenders are Upstart and systemd; neither is compatible with anything other than Linux, and they are similar in terms of licensing, with Upstart faring worse due to its CLA. OpenRC just doesn't have enough support outside of Gentoo.

                    Comment


                    • #40
                      Originally posted by liam View Post
                      More than any one thing? Maybe. It really depends on what constitutes "one thing". If Adobe ported all of their desktop apps to Linux, that would be HUGE. If OOTB Linux could handle both low latency and throughput as well as OS X, that would be a huge feature (the fact that we have even lower latency than OS X without massive throughput loss for the vast majority of workloads is already a big deal, but I don't know of an entity offering that setup for the desktop that also has support). Honestly, I keep going back to latency because that tells you much about the reliability of your underlying system.
                      We also need a really good DE. I'm not aware of one that is at once fast, stable, and "full featured" (by a Windows user's definition of that).
                      I tend to think of OS X as the gold standard for a good graphical experience. I think the Metro style has great potential and will replace the desktop metaphor in graphical shell design once it has matured a little, but at the moment, I think it is so incomplete that it is actually a regression from Aero and whatever they used to call it before that. I have never been a fan of the desktop metaphor and am always curious to see projects try to break the mold of what a traditional, desktop-themed graphical shell should look like. I think the desktop metaphor was a clever attempt to make the then-nascent graphical shells more naturally intuitive to office workers, and to some extent it did have success there, but overall I think people's comfort with such "traditional" approaches is due more to familiarity with the desktop metaphor than to a natural inclination towards interacting with their computers in this way.

                      I think that somewhere out there is a much better way to graphically represent and facilitate interaction with the functionality of a computer to human beings than what has been done with the desktop metaphor. I don't know what that way is. I really liked where Maemo was going with their graphical experience. (note: I am referring to Maemo before Maemo 5; I have never tried Maemo 5 and have no idea what it looks like, so I can't comment on whether or not I think it's an improvement on pre-5 or if it is worse) In fact, when I first saw Maemo, my immediate reaction was, "Finally! This is the one! This is the way graphical interfaces should have always been!" But that's a dead end now. OLPC's Sugar sounds great in concept but in practice I'm always disappointed when I check it out. So that's why I am curious to see where Microsoft takes the "Metro style". It's horrible right now, though. So for the time being the only graphical environments that I find truly impressive are those that still build on the old desktop metaphor, and of those the best I think belongs to OS X. As for Linux-based GUIs vs Windows pre-Metro GUIs, I think that the later, more mature offerings in the GNOME 3 and KDE 4 series are now roughly up to par with the one found in Windows 7, which I feel remains the best Windows GUI to date.

                      TL;DR: Ok, sorry, let me get back on topic. What I mean to say is, I think that GNOME and KDE are already comparable to the Windows graphical experience, but the real king of graphical experiences for the time being is OS X, not Windows.


                      Heh, I'm not sure either. What's clear to me, however, is that there is massive duplication of effort (yay, that old standard). Packaging format/tooling isn't a huge deal because, well, deb and rpm are structurally very similar and basically work alike. A much bigger deal is not being able to rely on standard system-level APIs. Systemd, as ericg says above, goes a long, long way towards fixing this. Since we are talking about Linux, I'm not considering the BSDs (at any rate, I think they serve a different, and shrinking, market, IMHO). LSB, in practice, seems to address mainly naming and placement conventions (very important, obviously), but adopting the actual APIs in the LSB would be a great thing (again, largely unneeded with systemd). As for Debian/Ubuntu, I think Ubuntu spins are going to increasingly diverge. Mir is not helping, but systemd just offers too many advantages, with standardization being a really nice bonus.
                      The best technology doesn't always win. I'd like to see wider systemd adoption, but for the time being there are compelling reasons for Debian to not make the switch, and for the time being Ubuntu continues to stand behind Upstart, and as I've already mentioned in an earlier post, over 60% of the distributions listed on distrowatch.com are either Debian or Ubuntu based. In the future, all of this can of course change. The Debian project might decide that Debian misses out on too much functionality when not using systemd and this can lead to an upsurge in interest in switching to systemd, or Canonical might decide that continued Upstart development does not provide a sufficient return on their investment and that continued development of systemd compatibility layers is just not worth the effort when a switch to systemd would eliminate the need for these layers. But those are just two speculative scenarios. Right now, it does not appear that either project is going to be switching to systemd.


                      The big "problem" is that, by its nature, Linux draws people who prize their independence over most anything else. The fact that most any distro can be spun to handle most any duty is proof that the changes the other distros felt had to be made just weren't necessary. That's not to say ALL forks aren't worthwhile, but anything beneath the UI frameworks needs to be quite standard (at least API-standard, if not ABI).
                      Not too long ago I got it into my head that I wanted a CLI-only OS in a 512MB volume to handle boot management and do system recovery without a live USB. I started with Debian, as that's what I was most familiar with, but the standard installation image failed to create an install that fit. Next, I tried Arch Linux and ended up having to delete the localization files and the man pages, and mount the package cache in tmpfs, in order to get a usable system without sacrificing useful tools like procps. Then I tried Slackware and got a fully usable system with all of the nice desired tools, with no hacking, in about half the space. That, to me, is a sign of fundamental philosophical differences that have real, practical consequences*. There are many projects that I feel do not bring anything significantly new or different to the table, but I feel that the major meta-distros all have something unique and worthwhile about them. As for specialty projects like ClearOS and BackTrack / Kali: sure, any meta-distro can do what they do, but sometimes it's nice to just get something that can perform such a specialized purpose with minimal hacking.

                      *The philosophical difference that I learned about from that experiment is that Arch Linux's focus on the bleeding edge has caused the project to make sacrifices in other core philosophies like simplicity and minimalism, whereas Slackware, which values practicality over rapid technology adoption, has not had to sacrifice its own simplicity and minimalism.
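                      The trimming steps described above can be sketched safely against a scratch directory with stand-in files (on a real system the targets would be /usr/share/locale, /usr/share/man, and the pacman package cache, and the tmpfs mount would be an fstab entry rather than a plain delete):

```shell
# Safe re-creation of the space-saving steps described above, run against
# a scratch directory instead of a real root filesystem.
root=$(mktemp -d)
mkdir -p "$root/usr/share/locale/de/LC_MESSAGES" \
         "$root/usr/share/man/man1" \
         "$root/var/cache/pacman/pkg"

# Stand-ins for localization files, man pages, and cached packages.
printf 'dummy' > "$root/usr/share/locale/de/LC_MESSAGES/coreutils.mo"
printf 'dummy' > "$root/usr/share/man/man1/ls.1.gz"
printf 'dummy' > "$root/var/cache/pacman/pkg/foo-1.0.pkg.tar.xz"

# Steps 1 and 2: drop locales and man pages.
rm -rf "$root/usr/share/locale" "$root/usr/share/man"

# Step 3: keep the package cache out of the volume entirely by mounting
# tmpfs over it; on a real system this would be an fstab entry like:
#   tmpfs  /var/cache/pacman/pkg  tmpfs  defaults  0 0
# Here we just empty the stand-in directory.
rm -f "$root/var/cache/pacman/pkg/"*

remaining=$(find "$root" -type f | wc -l)
echo "files left: $remaining"
rm -rf "$root"
```

                      The real savings obviously depend on the install; locales and man pages are simply the biggest easily-removable chunks on a minimal system.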


                      The best consensus doesn't always happen naturally. Glasnost wasn't natural. It was the result of LOTS of high-level diplomacy between two parties. The Linux community consists of WAY more than two groups, which would argue for natural consensus, but, at a high level, the deb/rpm split seems to work well as a shorthand. On one side you have Debian, and on the other you have Fedora/SUSE. The big blocker with Debian seems to be their insistence on keeping a BSD kernel as a drop-in when it should, arguably, be a tier-2 project. The entire community shouldn't be held back because of something like that. How many people are using Debian userspace with BSD? Some certainly, but I'd bet it's less than 1%. Why hold back the entire project for those few?
                      Yes, you are right about consensus reached through deliberation. The XDG/Portland/freedesktop.org standards, for example, required both GNOME and KDE developers to make compromises in the name of interoperability. But deliberations like that are kind of an exception. Normally, it is really hard to "force" consensus. You need the various stakeholders to compromise, and the standard will fail if too many of them do not make the necessary compromises. I think it is more likely for Debian to switch to systemd, for example, out of necessity than as a voluntary compromise in the name of interoperability.


                      About deb/rpm: while there are some differences, this has really become a religious issue, AFAICT. Gtk/Qt should be the place where differences in the system APIs can hide, and if that's what everyone used there'd be little problem, but proprietary apps, especially art and CAD apps, can and do use their own toolkits. For them, having a stable system API is a must. So, what they currently have to do is package for RHEL/SLES version X. That could go away. We wouldn't have to have people repackaging (assuming it's even possible) for every damn distro. Packaging is simply the largest user of resources, AFAICT.
                      Well, packaging is traditionally the responsibility of each distro and its packagers, not of upstream projects. The reason things have turned out that way is that for the longest time, proprietary software vendors shunned free operating systems, so the GNU project, the *BSD projects, and the Linux ecosystem developed a very rich library of free and open software that provided reasonable alternatives to the proprietary software. So packaging for a specific distro was a task best handled by the distro in question. But then proprietary software vendors started expressing an interest, and the established way of having the distros handle their own packaging is no longer an option because the source code is not available; in that model it's the upstream vendors themselves who end up having to package for the various distros. See, it's a manageable problem as long as the distros have access to the source code. Then the distros can just say, "Well, 'foo' works on Ubuntu, but it takes us extra effort to package 'foo' for our distro because we're not completely compatible with Ubuntu, so we have to solve some problems in order to get 'foo' to run on our system. Is it really worth the extra effort, or should we instead focus on removing the differences between our distro and Ubuntu so that it's easier to get programs like 'foo' running on our distro with less effort?" And they can then decide if their differences are really that important or not, because they're the ones who are having to pay the price of incompatibility. But when it's upstream that has to do the packaging, then it's not the distros' problem anymore, so the distros are not as motivated to remove the differences anymore, either.
                      But I'm a supporter of free and open source software, and I try to avoid using proprietary software as much as possible, so to me the only interest in making it easier for proprietary software vendors to make software for Linux-based OS'es is that it might bring greater attention to the ecosystem at large and hence lead to the improvement of the free and open software as well. So I'm in favor of seeing more proprietary software running on free OS'es, but it's not as important to me given my priorities, and I don't like the idea of compromises to free OS'es that mainly benefit proprietary vendors.


                      I think we largely agree; it's just that I don't think marketing is the biggest issue right now. Once we get Wayland (or something like it) working well, the open drivers in better shape (basically working as well as r600 does now, but for NVIDIA/Intel as well), systemd more or less completed (reliable hibernation and low power usage are hugely important), AND one really good DE, THEN I think a marketing campaign would be hugely beneficial. Right now I just don't think we are good enough in the areas that people will really notice.
                      The main reason I am still kinda hung up on marketing is because I think Linux-based and *BSD OS'es already have so much to offer to the world's computer users. I think that there is already so much here, there is definitely enough material for marketing experts to sink their teeth into if there was financial incentive for them to do so.

                      I do think that Valve's attention to a Linux base, and the upcoming SteamOS, are doing a heck of a lot of the marketing that I'm hoping for. Google Chromebooks benefit from marketing. Presumably Intel's Tizen laptops will be backed with marketing efforts from Intel as well. All of this kind of marketing raises user awareness of Linux-based OS'es in general, and projects like Debian, which rely on no budget and a tiny team of volunteers for their marketing, will end up benefiting from this as well.

                      So part of the reason why I keep coming back to marketing is because I think it's important, part of the reason is because I think we already have so much that is worth marketing, and finally part of the reason is because I see that marketing coming our way already. (And that's good news!) But it's as you say, otherwise we largely agree. There's plenty of room for improvement across the board, and I don't think that efforts in one area, like marketing, are mutually exclusive with efforts in another area, like continued efforts at standardization.

                      Comment


                      • #41
                        Argh, I feel like I'm writing a freaking book here :P

                        Comment


                        • #42
                          Originally posted by Serge View Post
                          I tend to think of OS X as the gold standard for a good graphical experience. I think the Metro style has great potential and will replace the desktop metaphor in graphical shell design once it has matured a little, but at the moment, I think it is so incomplete that it is actually a regression from Aero and whatever they used to call it before that. I have never been a fan of the desktop metaphor and am always curious to see projects try to break the mold of what a traditional, desktop-themed graphical shell should look like. I think the desktop metaphor was a clever attempt to make the then-nascent graphical shells more naturally intuitive to office workers, and to some extent it did have success there, but overall I think people's comfort with such "traditional" approaches is due more to familiarity with the desktop metaphor than to a natural inclination towards interacting with their computers in this way.

                          I think that somewhere out there is a much better way to graphically represent and facilitate interaction with the functionality of a computer to human beings than what has been done with the desktop metaphor. I don't know what that way is. I really liked where Maemo was going with their graphical experience. (note: I am referring to Maemo before Maemo 5; I have never tried Maemo 5 and have no idea what it looks like, so I can't comment on whether or not I think it's an improvement on pre-5 or if it is worse) In fact, when I first saw Maemo, my immediate reaction was, "Finally! This is the one! This is the way graphical interfaces should have always been!" But that's a dead end now. OLPC's Sugar sounds great in concept but in practice I'm always disappointed when I check it out. So that's why I am curious to see where Microsoft takes the "Metro style". It's horrible right now, though. So for the time being the only graphical environments that I find truly impressive are those that still build on the old desktop metaphor, and of those the best I think belongs to OS X. As for Linux-based GUIs vs Windows pre-Metro GUIs, I think that the later, more mature offerings in the GNOME 3 and KDE 4 series are now roughly up to par with the one found in Windows 7, which I feel remains the best Windows GUI to date.

                          TL;DR: Ok, sorry, let me get back on topic. What I mean to say is, I think that GNOME and KDE are already comparable to the Windows graphical experience, but the real king of graphical experiences for the time being is OS X, not Windows.
                          First, I TOTALLY agree with you about there being a better graphical paradigm than the desktop. I was really hoping it would be GS. As I've said many times previously the original G3 design doc had some really interesting ideas. The problem came down to implementation and lack of really creative types along with no UX experts. The UX situation has been getting better as they've started bringing in more folks and accepting criticism better, but the paradigm they've built isn't better enough and it's too late for them to do anything about it.
                          My only experience with maemo comes from my old n800. I have to say I wasn't a big fan. The hardware had some nice features but the software interface was not well suited to touch.
                          OLPC's Sugar is an interesting one. That is genuinely different (if perhaps not unique) and is highly targeted at their realistic audience (unlike GS's absurd personas). I tried Sugar a few years ago and had some issues with it but, overall, it seemed a genuine advance in thinking. I also liked the old moblin interface (back when it was based on clutter and gtk). They were using categories at the top and interesting symbolic icons, along with developing their own toolkit (mx). If you haven't tried it you might be surprised (it was always buggy, though).
                          The problem I had in mind with Gnome/KDE/Enlightenment/Unity/etc is their completeness. You HAVE to have graphical sysadmin tools (even if not intended for enterprise) b/c problems will occur and clis, as they are generally made now, are just not discoverable (a few exceptions are fish and Final Term, along with an older project from Colin Walters that built a shell that made heavy use of python instead of bash and had significant graphical capabilities---those projects are all pointing the way to the future of graphical clis and the ideas really need to be brought to fruition).
                          I haven't played with metro as much. My biggest concern is the "root-less" aspect of moving around. I believe you need a root to move from. A "homescreen" is a great way to act as a launchpad to activities. Eventually I'd love to move away from even that, but, for now, the rootless aspect of metro bugs me. To be clear, when I say rootless I am referring to the homescreen being something that you freely move side to side (I'm not a big fan of side-to-side screen movement b/c it leaves you disoriented and without a fast way to return to a specific place) without a SINGLE frame that acts as HOME. Rather it is home extended across a navigable strip of uncertain length.
                          That nit aside, metro is incredibly interesting and I think it could be pointing a way forward.
                          OSX works well, but they've really stagnated over the last five or so years. Expose was a really nice idea, as was quicksilver (not apple's invention, but still developed FOR osx) but other than those I struggle to think of very useful, and unique, ui elements.



                          The best technology doesn't always win. I'd like to see wider systemd adoption, but for the time being there are compelling reasons for Debian to not make the switch, and for the time being Ubuntu continues to stand behind Upstart, and as I've already mentioned in an earlier post, over 60% of the distributions listed on distrowatch.com are either Debian or Ubuntu based. In the future, all of this can of course change. The Debian project might decide that Debian misses out on too much functionality when not using systemd and this can lead to an upsurge in interest in switching to systemd, or Canonical might decide that continued Upstart development does not provide a sufficient return on their investment and that continued development of systemd compatibility layers is just not worth the effort when a switch to systemd would eliminate the need for these layers. But those are just two speculative scenarios. Right now, it does not appear that either project is going to be switching to systemd.
                          Looking at number of distros is less useful than percentage of users per distro, imho. I don't think debian itself has a massive user base (smaller than ubuntu, suse, fedora, mint, maybe even mageia, afaict), so we need to look more towards ubuntu. I'm not too worried about ubuntu since I don't think they'll be a force for much longer. Their move to unity, and worse, mir, has brought about serious problems with the spins. I think those distros will increasingly ask the question "is using ubuntu as our upstream the best choice?" I think you'll see some movement away from ubuntu/debian and towards, the hopefully accepted, new fedora ring scheme. That is basically designed to act as a platform for builders. Along with them you have the excellent suse studio service and obs (the latter of which fedora MIGHT be moving to as well).
                          That was mostly speculation backed by hope, but I do think it is a very possible, and reasonable, path forward that would also go a long ways towards making the linux ecosystem both more robust (by having more standard, flexible, well designed components) and a better target for proprietary development.


                          Not too long ago I got it into my head that I wanted a CLI-only OS in a 512MB volume to handle boot management and do system recovery without a live USB. I started with Debian, as that's what I was most familiar with, but the standard installation image failed to create an install that fit. Next, I tried Arch Linux and ended up having to delete the localization files, the man pages, and mount the package cache in tmpfs in order to get a usable system without sacrificing useful tools like procps. Then I tried Slackware and got a fully usable system with all of the nice desired tools, with no hacking, in about half the space. That, to me, is a sign of fundamental philosophical differences that have real, practical consequences*. There are many projects that I feel do not bring anything significantly new or different to the table, but I feel that the major meta-distros all have something unique and worthwhile about them. As for specialty projects like ClearOS and BackTrack / Kali: sure, any meta-distro can do what they do, but sometimes it's nice to just get something that can perform such a specialized purpose with minimal hacking.

                          *The philosophical difference that I learned about from that experiment is that Arch Linux's philosophical focus on the bleeding edge has caused the project to make sacrifices with other core philosophies like simplicity and minimalism, whereas Slackware, which values practicality over rapid technology adoption, has not had to sacrifice its own simplicity and minimalism philosophies.
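                          The trimming steps described above (drop the non-English locales, drop the man pages, and keep pacman's package cache in tmpfs) might be sketched roughly like this. The paths are the standard Arch locations, but the scratch ROOT directory and the mkdir setup are purely illustrative so the commands can be dry-run safely; on a real install you would point ROOT at / and run as root:

```shell
# Scratch directory standing in for the target filesystem root,
# so this sketch is safe to run anywhere (set ROOT=/ on a real system).
ROOT=/tmp/arch-trim-demo
mkdir -p "$ROOT/usr/share/locale/de" "$ROOT/usr/share/locale/en_US" \
         "$ROOT/usr/share/man/man1" "$ROOT/etc"

# 1. Delete every locale directory except the English ones
find "$ROOT/usr/share/locale" -mindepth 1 -maxdepth 1 ! -name 'en*' -exec rm -rf {} +

# 2. Delete the man pages
rm -rf "$ROOT/usr/share/man/"*

# 3. Keep pacman's package cache in RAM instead of on the small volume
echo 'tmpfs /var/cache/pacman/pkg tmpfs defaults,size=64M 0 0' >> "$ROOT/etc/fstab"
```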
                          You could have started with something like puppy. That's been designed to load completely into memory (with a gui, but you could always strip out the gui). I would be willing to bet you could work from debian to puppy without a great deal of trouble, but I haven't actually tried it.
                          The fedora ring system I was talking about would've been hugely helpful to you. Ring 0, provides the BARE minimum for a bootable system (https://lists.fedoraproject.org/pipe...ly/186323.html).
                          So, the issue doesn't seem insoluble, but needs a robust enough design that it can accommodate the VAST majority (as I recall, fedora.next was even intended to be a base for embedded systems, eventually).
                          I'm mentioning fedora b/c that's what I'm most familiar with, and they have the most money behind them to actually enact these projects, along with people who are hugely passionate about open source and not hindered by someone like Shuttleworth.


                          Yes, you are right about consensus reached through deliberation. The XDG/Portland/freedesktop.org standards, for example, required both GNOME and KDE developers to make compromises in the name of interoperability. But deliberations like that are kinda an exception. Normally, it is really hard to "force" consensus. You need various stakeholders to compromise, and the standard will fail if too many of them do not make the necessary compromises. I think it is more likely for Debian to switch to systemd, for example, out of necessity than as a voluntary compromise in the name of interoperability.
                          FDO is a great example. It is hard, but it can be done. Moving to systemd (or at least supporting their api) ALONE goes so far to help matters that I wonder if that is at least in the back of their minds.


                          Well, packaging is traditionally the responsibility of each distro and its packagers, not of upstream projects. The reason things have turned out that way is because for the longest time, the proprietary software vendors shunned free operating systems, so as a result the GNU project, the *BSD projects, and the Linux ecosystem developed a very rich library of free and open software that provided reasonable alternatives to the proprietary software. So packaging for a specific distro was a task best handled by the distro in question. But then proprietary software vendors start expressing an interest, and the established way of having the distros handle their own packaging is no longer an option because the source code is not available, and in that model it's the upstream vendors themselves who end up having to package for the various distros. See, it's a manageable problem as long as the distros have access to the source code. Then the distros can just say, "Well, 'foo' works on Ubuntu, but it takes us extra effort to package 'foo' for our distro because we're not completely compatible with Ubuntu so we have to solve some problems in order to get 'foo' to run on our system. Is it really worth the extra effort, or should we instead focus on removing the differences between our distro and Ubuntu so that it's easier to get programs like 'foo' running on our distro with less effort?" And they can then decide if their differences are really that important or not, because they're the ones who are having to pay the price of incompatibility. But when it's upstream that has to do the packaging, then it's not the distros' problem anymore, so the distros are not as motivated to remove the differences anymore, either. 
But I'm a supporter of free and open source software, and I try to avoid using proprietary software as much as possible, so to me the only interest in making it easier for proprietary software vendors to make software for Linux-based OS'es is in that it might bring greater attention to the ecosystem at large and hence lead to the improvement of the free and open software as well. So I'm in favor of seeing more proprietary software running on free OS'es, but it's not as important for me given my priorities, and I don't like the idea of compromises to free OS'es that mainly benefit proprietary vendors.
                          Packaging is the area with, perhaps, the most duplication of effort. Each distro shouldn't have to repackage every damn thing. That's a massive waste of resources (if you can package apps, you can/do program), and this problem, as you note, isn't just for proprietary software. The fact that there are at least two (I'd guess many more) glibcs being packaged for every release is insane. Distros could be so much more reliable, and be able to focus on what they really want if they relied on some Common Base.

                          The main reason I am still kinda hung up on marketing is because I think Linux-based and *BSD OS'es already have so much to offer to the world's computer users. I think that there is already so much here, there is definitely enough material for marketing experts to sink their teeth into if there was financial incentive for them to do so.

                          I do think that Valve's attention to a Linux base, and the upcoming Steam OS, are doing a heck of a lot of the marketing that I'm hoping for. Google Chromebooks benefit from marketing. Presumably Intel's Tizen laptops will be backed with marketing efforts from Intel as well. All of this kind of marketing raises user awareness for Linux-based OS'es in general, and projects like Debian who rely on no budget and a tiny team of volunteers for their marketing will end up benefiting from this as well.

                          So part of the reason why I keep coming back to marketing is because I think it's important, part of the reason is because I think we already have so much that is worth marketing, and finally part of the reason is because I see that marketing coming our way already. (And that's good news!) But it's as you say, otherwise we largely agree. There's plenty of room for improvement across the board, and I don't think that efforts in one area, like marketing, are mutually exclusive with efforts in another area, like continued efforts at standardization.
                          Yes, I agree that there is currently enough for marketing people to get excited about...and they do. There are plenty of companies that cater to enterprise that spend big marketing dollars (don't forget that IBM has just pledged to spend $1 billion over the next decade on Linux). Then you have something like TiVo, or Roku, which are Linux based, and, of course, Android, but there are many others. If we're talking about DE's, however, I am simply not convinced any are quite ready yet.

                          Comment


                          • #43
                            Originally posted by liam View Post
                            ...
                            I also liked the old moblin interface (back when it was based on clutter and gtk). They were using categories at the top and interesting symbolic icons, along with developing their own toolkit (mx). If you haven't tried it you might be surprised (it was always buggy, though).
                            ...
                            Ah, damn it! I actually meant Moblin, not Maemo, when I was talking about the interface that I really liked. I always mix up Maemo and Moblin. Conceptually, the Moblin graphical interface was brilliant.


                            The problem I had in mind with Gnome/KDE/Enlightenment/Unity/etc is their completeness. You HAVE to have graphical sysadmin tools (even if not intended for enterprise) b/c problems will occur and clis, as they are generally made now, are just not discoverable (a few exceptions are fish and Final Term, along with an older project from Colin Walters that built a shell that made heavy use of python instead of bash and had significant graphical capabilities---those projects are all pointing the way to the future of graphical clis and the ideas really need to be brought to fruition).
                            Do you feel that something has actually changed in CLI design itself that has made CLIs less discoverable than they used to be?

                            I have always considered discoverability to be the main natural disadvantage of CLIs when compared to GUIs. CLIs simply rely more on rote memorization than GUIs do. What I think has changed is that modern software systems have become so complicated that intuitive discoverability is more important than before, so the CLI disadvantage in that regard has become more acute. More simply put: our systems are so complicated, consisting of so many layers on top of layers on top of layers all modifying each other in some way, we don't have time to memorize every CLI program's command syntax and arguments list. When systems were simpler, memorizing everything was easier.

                            I do think there are actual design trends against CLI. Take dconf and GSettings, for example. Sure, you can still parse it, cut it, etc., but it's just a little more difficult, takes a few more keystrokes on the command line or a few more lines in your script, to do the same things with GSettings that you would do with a config system that stores everything in plain text. But I'm not sure if this movement is all that endemic. GSettings aside, I don't feel like CLIs are getting worse, but rather it's the complexity of the systems that is making CLIs less appealing.
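                            The extra friction described above can be made concrete with a small comparison. The `someapp.conf` file is a hypothetical plain-text config invented for illustration; `org.gnome.desktop.interface` / `font-name` are real GNOME schema and key names, and the `gsettings` call is guarded so the sketch still runs on systems without GNOME schemas installed:

```shell
# Create a throwaway plain-text config to stand in for a hypothetical app
printf 'font=Cantarell 11\ntheme=dark\n' > /tmp/someapp.conf

# Plain-text config: the standard text tools work on it directly
grep '^font=' /tmp/someapp.conf | cut -d= -f2

# GSettings/dconf: the equivalent read has to go through the gsettings CLI
# instead of generic text tools (guarded so it is a no-op where GNOME is absent)
command -v gsettings >/dev/null 2>&1 \
    && gsettings get org.gnome.desktop.interface font-name \
    || true
```

The difference is small per invocation, but it means scripts can no longer treat every setting as greppable text.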


                            I haven't played with metro as much. My biggest concern is the "root-less" aspect of moving around. I believe you need a root to move from. A "homescreen" is a great way to act as a launchpad to activities. Eventually I'd love to move away from even that, but, for now, the rootless aspect of metro bugs me. To be clear, when I say rootless I am referring to the homescreen being something that you freely move side to side (I'm not a big fan of side-to-side screen movement b/c it leaves you disoriented and without a fast way to return to a specific place) without a SINGLE frame that acts as HOME. Rather it is home extended across a navigable strip of uncertain length.
                            That nit aside, metro is incredibly interesting and I think it could be pointing a way forward.
                            Yeah, "potential" was the first word that popped to mind when I first tried out Metro. As it stands right now, it is terrible.


                            OSX works well, but they've really stagnated over the last five or so years. Expose was a really nice idea, as was quicksilver (not apple's invention, but still developed FOR osx) but other than those I struggle to think of very useful, and unique, ui elements.
                            I agree that OS X is stagnating. I kinda feel that way about Apple in general.


                            Looking at number of distros is less useful than percentage of users per distro, imho. I don't think debian itself has a massive user base (smaller than ubuntu, suse, fedora, mint, maybe even mageia, afaict), so we need to look more towards ubuntu.
                            Oh, I totally agree with you, but "number of distros" is the only reliable statistic we have. Some distros (Fedora and OpenSUSE, for example) put a decent amount of effort into trying to figure out how many users they have, and do so transparently enough that we can have some amount of confidence in the numbers they report, but for the most part distros are so terribly inconsistent about this - Debian doesn't even try, while Ubuntu keeps their methodology secret (so how do we know that downstream users of distros such as Linux Mint and Bodhi aren't being counted as Ubuntu users?). That's why I resort to looking at number of derivatives. And while I agree that Debian is not as widely used as its name recognition would suggest, Debian is nonetheless a very popular base for niche distros: Crunchbang, siduction, etc. - no doubt their usage share is very small, but there's so many of them that surely their users add up.


                            I'm not too worried about ubuntu since I don't think they'll be a force for much longer. Their move to unity, and worse, mir, has brought about serious problems with the spins. I think those distros will increasingly ask the question "is using ubuntu as our upstream the best choice?" I think you'll see some movement away from ubuntu/debian and towards, the hopefully accepted, new fedora ring scheme. That is basically designed to act as a platform for builders. Along with them you have the excellent suse studio service and obs (the latter of which fedora MIGHT be moving to as well).
                            That was mostly speculation backed by hope, but I do think it is a very possible, and reasonable, path forward that would also go a long ways towards making the linux ecosystem both more robust (by having more standard, flexible, well designed components) and a better target for proprietary development.
                            I agree with everything except the first sentence. I feel like the derivatives momentum is against Ubuntu, but those derivatives that do decide to rebase will more likely look to Debian first, as that's what's most similar to Ubuntu right now. Down the road, I do see more Fedora and OpenSUSE based distros, but I think it will be several years before the tide really shifts.


                            ...
                            Packaging is the area with, perhaps, the most duplication of effort. Each distro shouldn't have to repackage every damn thing. That's a massive waste of resources (if you can package apps, you can/do program), and this problem, as you note, isn't just for proprietary software. The fact that there are at least two (I'd guess many more) glibcs being packaged for every release is insane. Distros could be so much more reliable, and be able to focus on what they really want if they relied on some Common Base.
                            I don't think we necessarily need a single "Common Base". In a way, that's already what Debian does for its direct derivatives and Ubuntu derivatives. Ubuntu makes changes where Ubuntu developers disagree with the course taken by Debian developers (such as packaging newer versions of major software like Apache and Firefox), but for the most part the vast majority of Ubuntu packages are directly merged from Debian. I haven't seen any recent stats, but given that it was like 92% three years ago, I wouldn't be surprised if it's still around 85-90%. Hopefully developments in Fedora and OpenSUSE will lead to even more convergence on common shared bases.


                            Yes, I agree that there is currently enough for marketing people to get excited about...and they do. There are plenty of companies that cater to enterprise that spend big marketing dollars (don't forget that IBM has just pledged to spend $1 billion over the next decade on Linux). Then you have something like TiVo, or Roku, which are Linux based, and, of course, Android, but there are many others. If we're talking about DE's, however, I am simply not convinced any are quite ready yet.
                            Well, I guess I have to agree, because although I do feel like our DE's are competitive with Windows when it comes to integrated features, I haven't had good experiences showing them to novice-level users who are more comfortable with Windows. But the various distros all have original features that, if combined together in a single distro, would make a great entry point. Basically, like what Ubuntu used to be before Unity and adware. So I think we're pretty close.

                            Comment


                            • #44
                              Originally posted by MartinN View Post
                              RHEL7 comes bundled with Gnome on Wayland!
                              You must really hate RHEL. Gnome would kill it.

                              Comment


                              • #45
                                Originally posted by Serge View Post
                                Ah, damn it! I actually meant Moblin, not Maemo, when I was talking about the interface that I really liked. I always mix up Maemo and Moblin. Conceptually, the Moblin graphical interface was brilliant.
                                Yeah, it was. I'd really like to see someone pick it up and work with it. I know it was designed for very screen-constrained environments but I'd be curious to see if that could be extended, in some sense, to accommodate greater use cases. I wonder how that hub might work as a homescreen?



                                Do you feel that something has actually changed in CLI design itself that has made CLIs less discoverable than they used to be?

                                I have always considered discoverability to be the main natural disadvantage of CLIs when compared to GUIs. CLIs simply rely more on rote memorization than GUIs do. What I think has changed is that modern software systems have become so complicated that intuitive discoverability is more important than before, so the CLI disadvantage in that regard has become more acute. More simply put: our systems are so complicated, consisting of so many layers on top of layers on top of layers all modifying each other in some way, we don't have time to memorize every CLI program's command syntax and arguments list. When systems were simpler, memorizing everything was easier.

                                I do think there are actual design trends against CLI. Take dconf and GSettings, for example. Sure, you can still parse it, cut it, etc., but it's just a little more difficult, takes a few more keystrokes on the command line or a few more lines in your script, to do the same things with GSettings that you would do with a config system that stores everything in plain text. But I'm not sure if this movement is all that endemic. GSettings aside, I don't feel like CLIs are getting worse, but rather it's the complexity of the systems that is making CLIs less appealing.
                                I don't think clis have really changed much at all (maybe the main difference is tab-completion, along with deep tab-completion). As you say, clis are inherently less discoverable as compared to guis, but there are some clis, like the ones I mentioned, which attempt to make some serious progress towards rectifying that to some extent (again, Colin Walters's was especially ahead of its time, but Final Term looks very promising as it's being built with Vala and intended to make full use of gtk-clutter along with including the easy extensibility of fish).
                                Regarding dconf/gsettings, I think you're right that they (and something like dconf-editor for the gui) greatly help modernize/simplify system management. It's very much in the spirit of Poettering, whose vision for Linux I think very highly of.


                                Oh, I totally agree with you, but "number of distros" is the only reliable statistic we have. Some distros (Fedora and OpenSUSE, for example) put a decent amount of effort into trying to figure out how many users they have, and do so transparently enough that we can have some amount of confidence in the numbers they report, but for the most part distros are so terribly inconsistent about this - Debian doesn't even try, while Ubuntu keeps their methodology secret (so how do we know that downstream users of distros such as Linux Mint and Bodhi aren't being counted as Ubuntu users?). That's why I resort to looking at number of derivatives. And while I agree that Debian is not as widely used as its name recognition would suggest, Debian is nonetheless a very popular base for niche distros: Crunchbang, siduction, etc. - no doubt their usage share is very small, but there's so many of them that surely their users add up.
                                Fair enough


                                I agree with everything except the first sentence. I feel like the derivatives momentum is against Ubuntu, but those derivatives that do decide to rebase will more likely look to Debian first, as that's what's most similar to Ubuntu right now. Down the road, I do see more Fedora and OpenSUSE based distros, but I think it will be several years before the tide really shifts.
                                I realise it's a bit of a long shot, but I just keep hoping that debian will either "pick a side" (I've nothing against bsds, but linux does have features that don't exist on bsds and that fact is holding debian-linux back, and thus many of the derivatives), or re-modularize (so as to allow different display servers, or init systems between kernels).
                                Re-basing on suse would be fine as well since I think their external tooling is superior to, well, everyone else's. The reason I would hope they'd choose fedora is, however, that fedora has the biggest money behind it so there is no danger of it going away. Along with that you have a very, very strong commitment to upstream and maintaining a pure open source experience. Those things, along with their re-prioritization of where qa resources should go, and careful definition of deployment units really make them an excellent basis for solid, and versatile, respins.


                                I don't think we necessarily need a single "Common Base". In a way, that's already what Debian does for its direct derivatives and Ubuntu derivatives. Ubuntu makes changes where Ubuntu developers disagree with the course taken by Debian developers (such as packaging newer versions of major software like Apache and Firefox), but for the most part the vast majority of Ubuntu packages are directly merged from Debian. I haven't seen any recent stats, but given that it was like 92% three years ago, I wouldn't be surprised if it's still around 85-90%. Hopefully developments in Fedora and OpenSUSE will lead to even more convergence on common shared bases.
                                By common base, I mean low level expectations that developers can rely on (so, an api at a minimum, and, for lts type releases, abi would be great). Of course, for something like embedded systems, these types of requirements could be more flexible.


                                I think Mint has done a pretty good job in the past (haven't used them recently). They make many, many things very easy. Something recent that fedora/gnome has added is builtin access to askfedora. It's a very friendly website established by Rahul, IIRC, that, well, lets people have their questions answered. The builtin aspect is that there is a client that will let you, I think, query, submit and receive responses in a more friendly way than irc (I say "think" b/c I'm still on GS 3.6 which only has very basic askfedora functionality).

                                Comment
