Debian Sticking With Merged /usr Plan

  • #31
    Originally posted by tuxd3v View Post

    He is right... for ASCII files, the magic number, or the equivalent of an ELF header, is the shebang.
    The MS Windows guys are the ones who execute files by extension.
    1. That's not what he said (he said that he needs to write a shebang that points to the binary, or even a symlink).
    2. Point me to the magic number that says an ASCII file is a shell script: https://en.wikipedia.org/wiki/List_of_file_signatures
    (no, you can't, as there isn't any). The magic-number check for any plain text file gives exactly the same result; it does not tell the difference between shell, Python, Perl, Ruby, JavaScript, C, C#, Java or whatever the fuck else, so you can't use the magic number to decide which parser is best for your script.
    3. Parsing an extension is faster than parsing the text to detect what kind of code it is.



    • #32
      Originally posted by hreindl View Post

      and that's exactly what gets better with this change, because the random differences for no practical reason disappear
      other Unix systems made that change years ago; only some stone-old Linux distributions are incompatible with the rest of the world
      Exactly, but when I said "lack of compatibility" I meant a future removal of the /bin, /sbin and /lib symlinks.



      • #33
        Originally posted by Kano View Post
        If you write new scripts you could use the "#!/bin/env xxx" approach for the interpreter, but that is certainly not the deal-breaking issue, as there will always be the /bin to /usr/bin symlink. If you begin to check with hardcoded paths for files inside /usr that were not there before, however, it would break. I don't know how many scripts use this kind of existence check, but that's something that would break on unmerged systems - you have to think a little bit more, but nothing critical. A merge of bin and sbin would lead to many more incompatible scripts (for backwards-compatible use).
        I doubt that future-looking distros (Fedora, Arch) will keep that compatibility for long.

        I don't see why (third-party) scripts should have backwards-compatibility issues. Do you really need to specify the full path to executables at all (especially in a multi-distro script)? Can't you just add a check at the beginning of the script that detects where the tools actually are and adjusts the paths it calls (and throws an error if nothing is found, so the user knows what's wrong instead of the script failing hard with a weird error when it can't find the tool)?

        I mean, I've been doing this already for other distro-specific locations of configs and other stuff I need to parse in my scripts; it's no big deal.
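
        A minimal sketch of the kind of check described above, assuming POSIX sh and using lsblk purely as a placeholder for whatever tool the script needs:
        Code:
        #!/bin/sh
        # Find the tool via PATH first, then fall back to the historical
        # hardcoded locations, so it works on merged and unmerged layouts.
        if command -v lsblk >/dev/null 2>&1; then
            TOOL=$(command -v lsblk)
        elif [ -x /usr/bin/lsblk ]; then
            TOOL=/usr/bin/lsblk
        elif [ -x /bin/lsblk ]; then
            TOOL=/bin/lsblk
        else
            echo "error: lsblk not found (is util-linux installed?)" >&2
            exit 1
        fi
        "$TOOL" -o NAME,MOUNTPOINT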



        • #34
          There are other ways to check than [ -x /bin/whatever ], but it is certainly used by some scripts. If somebody changes that to [ -x /usr/bin/whatever ] because "type -f whatever" now shows /usr/bin (or vice versa), it would break compatibility with unmerged systems. It is most likely even trickier for /usr/sbin merges.
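
          For comparison, a merge-agnostic form of that existence check resolves the path at runtime instead of hardcoding it; a rough sketch, with "whatever" as a placeholder command name:
          Code:
          #!/bin/sh
          # Instead of hardcoding [ -x /bin/whatever ] or [ -x /usr/bin/whatever ],
          # ask the shell where the command actually resolves to.
          if WHATEVER=$(command -v whatever 2>/dev/null); then
              echo "whatever found at: $WHATEVER"   # works on both merged and unmerged systems
          else
              echo "whatever is not installed" >&2
              exit 1
          fi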
          Last edited by Kano; 06 March 2019, 01:31 AM.



          • #35
            Originally posted by jabl View Post
            Probably never. Things like /bin/sh are so deeply embedded in Linux that removing them is more or less impossible. UsrMerge is still good as it gets rid of some historical cruft, though.
            Agreed... there are literally millions - maybe billions - of shell scripts out there starting with #!/bin/sh, #!/bin/bash, etc. Merging the directories is nice, but the compatibility links are never going away...



            • #36
              Originally posted by hreindl View Post
              most of them are in packages and it's pretty easy to fix that stuff at build time; as for the rest, you won't find any script on the dozens of machines maintained by me that hasn't been starting with #!/usr/bin/bash for years now
              No, I'm thinking of all those countless shell scripts that anyone who works on Unix systems maintains, many of which were written decades ago on non-Linux systems, and many of which still run on non-Linux systems. Some of us still have to deal with the likes of AIX and BSD, etc... and the major Linux distros are aware of that... they're not going to break that kind of compatibility lightly...
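
              (For reference, the build-time fix mentioned in the quote usually amounts to a shebang rewrite during packaging; a rough sketch with GNU sed, where the glob and paths are only illustrative:)
              Code:
              # Rewrite hardcoded #!/bin/bash shebangs to the /usr path in packaged scripts.
              find . -type f -name '*.sh' \
                  -exec sed -i '1s|^#!/bin/bash|#!/usr/bin/bash|' {} +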



              • #37
                Originally posted by starshipeleven View Post
                2. Point me to the magic number that says an ASCII file is a shell script: https://en.wikipedia.org/wiki/List_of_file_signatures
                (no, you can't, as there isn't any). The magic-number check for any plain text file gives exactly the same result; it does not tell the difference between shell, Python, Perl, Ruby, JavaScript, C, C#, Java or whatever the fuck else, so you can't use the magic number to decide which parser is best for your script.
                The equivalent of a magic number in a text file is the shebang.
                So you need to read the first line of a text file to know exactly what kind of file it is.
                There are a lot of *people* out there who don't even have a clue what they are doing.

                Relying on a generic extension alone for a shell script should be avoided at all costs!
                Because the default shell is NOT always the same, it's a shot in the dark and a very, very bad practice.
                It is even dangerous to do that, and it has no portability.

                An example..
                Code:
                $ cat test.sh
                echo "UNKNOWN sh file"

                $ cat test
                #!/usr/bin/env bash
                echo "bash file"

                $ file ./test.sh
                ./test.sh: ASCII text

                $ file ./test
                ./test: Bourne-Again shell script, ASCII text executable
                Originally posted by starshipeleven View Post
                3. parsing an extension is faster than parsing the text to detect what kind of code is that.
                Parsing a generic extension is the worst thing you can do, since it doesn't tell you which interpreter will run the script.
                It's simply wrong!

                That is what gtksourceview has been doing, and because of that, editors that rely on gtksourceview, such as Mousepad, cannot detect the scripting language and present the file as plain text, which is wrong, since it could be Java, Bash, Korn shell, Lua, Python, and so on.

                They have had a bug open for some 10 years, and it is still open.

                That is what MS Windows does, and it's badly wrong.
                Last edited by tuxd3v; 06 March 2019, 07:40 AM. Reason: retracting the offensive language used, and for that my apologies



                • #38
                  Originally posted by jacob View Post

                  I felt the same for a long time. The idea of having / as a rescue system never made sense to me; if your system is upside down, it's usually because the filesystem is corrupted, there's a problem with the disk itself, a SATA/SAS driver problem, etc. Either way, by definition you won't be able to boot from /.
                  If I remember correctly, the "recovery" aspect of the root vs. /usr split was designed in the days when disk space was at a premium and you might want multiple machines to mount /usr over NFS to avoid keeping multiple copies (i.e. the point was to be able to boot, diagnose, and recover when something like a network hiccup or a hardware upgrade prevented /usr from mounting over the network).
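
                  As a rough sketch of what that setup looked like, the root filesystem sat on a small local disk while /etc/fstab pulled /usr from the file server (the server name and mount options here are invented):
                  Code:
                  # /etc/fstab: local root plus /usr shared read-only over NFS
                  /dev/sda1               /     ext2  defaults      1 1
                  fileserver:/export/usr  /usr  nfs   ro,bg,nolock  0 0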



                  • #39
                    This was seen on Hurd from the very beginning.



                    • #40
                      Originally posted by starshipeleven View Post
                      It seems the location of the shell isn't part of POSIX, they only need a POSIX shell to exist somewhere accessible. https://unix.stackexchange.com/quest...-bin-directory
                      Ahh. My mistake. I'd assumed support for shebangs was required by POSIX and, if that's the case, it'd be insane to not also provide a standard location for a Bourne-compatible shell.
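
                      If you ever do need to locate that shell portably, the usual trick is to search the PATH that getconf reports instead of assuming /bin; a minimal sketch:
                      Code:
                      # getconf PATH yields a PATH that is guaranteed to find the standard
                      # utilities, including a POSIX sh, wherever the system installed them.
                      PATH=$(getconf PATH) command -v sh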

