The Linux Kernel Deprecates The 80 Character Line Coding Style


  • #51
    Originally posted by Raka555 View Post

    It is actually people who are getting dumber over time.
    The average sentence length was over 60 words in the 1500s and has kept declining ever since.

    "Modern" humans get a cognitive overload when you go past 14 words in a sentence today:
    (It is probably even shorter than that today)

    No wonder the poor programmers can't cope with the added cognitive load of manual memory management on top of that today...
    Although Idiocracy is a favorite film of mine, that statement is total rubbish. Sentence length is not a pure function of intelligence.
    Sentence length is a function of social climate. A while back, we used words to paint a prettier picture.
    That was needed to fill in the blanks for the reader's imagination. As this is hardly an issue today, sentences can be shorter, more precise and more to the point.
    Society is different, and so is language. Nobody wants to hear you blabber for ages to make a point that could be made in a third of the time.

    Save long eloquent sentences for Shakespeare. We live in the 21st century now.


    • #52
      Originally posted by milkylainen View Post

      Totally missing the point?
      80 chars was not much of a thing because of resolution or terminal width even 25 years ago.
      It's because 80 chars actually served a purpose in forcing people to write readable code instead of nested hell and variables in novel form.
      It still does to this day. We are still talking about C and the Linux kernel.

      The issue I have with increasing this limit is when does 100 chars become the "occasional line" instead of the norm?
      To my eyes, kernel code written to exceed 80 chars is usually badly structured, badly nested, or generally suffering in the readability department.
      Not necessarily, or possibly not at all.
      80 characters is a harsh limit that forces people to write unreadable, unsustainable code: acronyms and shortened names everywhere, plus abuse of "small steps" and over-nesting just to avoid going a few characters over 80.
      It's also problematic when you want to write proper inline documentation, especially in a non-English language (most Romance languages are more verbose than English, so it's easy to get a 30% or 40% increase in length to express the same thing, unless you don't mind butchering the grammar).

      It was legitimate when name length actually made a sensible difference in performance, and when code was "simpler".
      Today, it's a crutch, because programmers have to change the structure of their code just to satisfy an arbitrary limit.
      An extra 20 characters may not seem like much at first glance, but in my work environment it should be easy enough to respect.
      I'm not sure 120 characters would have been a better choice, but I wouldn't have minded it either.

      Note that I speak "in general" here, because that "80 char rule" has sadly overrun many languages, tools and frameworks.
      I have no opinion on the particular "Linux kernel" topic, because I'm aware it's far above what I can comprehend for now.


      • #53
        Originally posted by atomsymbol
        Just a note: if the Linux kernel were using automatic C code formatting, and automatic formatting were widespread, then the column number wouldn't matter, because developers/readers would simply reformat any Linux code before editing or viewing it.
        Automatic reformatting tools are very useful for cleaning up bad code, but I would never use them without supervision like that. Lacking the ability to reason, they're inflexible, and can't cope with the idea that there are cases where breaking the formatting rules is the right thing to do. In Java-land for example, builder patterns are quite common, but I've yet to see a formatter that can handle them, because you have to understand the code to recognise that a chained series of method calls is a builder, and not just a garden-variety train-wreck.
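        For what it's worth, here is a minimal Java sketch of the kind of builder chain being described (every class and method name below is invented for the example). Hand-formatted with one call per line, each configuration step reads as its own step; a formatter that only counts columns would happily fold several calls onto one line and lose that structure.

```java
// Minimal sketch of a builder chain (all names invented for illustration).
public class BuilderDemo {
    static class RequestBuilder {
        private final StringBuilder parts = new StringBuilder();
        RequestBuilder host(String h)  { parts.append("host=").append(h).append(';'); return this; }
        RequestBuilder port(int p)     { parts.append("port=").append(p).append(';'); return this; }
        RequestBuilder timeout(int ms) { parts.append("timeout=").append(ms).append(';'); return this; }
        String build()                 { return parts.toString(); }
    }

    public static void main(String[] args) {
        // Deliberately one call per line: the chain reads as a recipe,
        // even though all of it would fit on far fewer, wider lines.
        String req = new RequestBuilder()
                .host("example.org")
                .port(443)
                .timeout(5000)
                .build();
        System.out.println(req); // host=example.org;port=443;timeout=5000;
    }
}
```

        To preserve that layout, a formatter would have to recognise the chain as a builder rather than as just one long expression, which is exactly the kind of judgment such tools lack.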


        • #54
          Originally posted by Delgarde View Post

          Fundamentally, it's because it's bad manners to waste the time of human reviewers with trivial formatting issues... that kind of analysis is something that can easily be automated, and so developers are expected to have run the relevant tools and fixed up the formatting before submitting patches.
          Actually, I disagree with the bolded part, namely the implication that automatic resolution is efficient.

          It's actually just a crutch: usually not great, sometimes outright bad.
          The thing is, a program can easily detect where a word stops or where a method call ends, so it's easy for it to avoid creating bugs or sheer unreadability by cutting words in the middle. But that's about all it can do.

          It can do nothing to determine the best places to break lines so as to keep code or comments in the groupings that are most readable for a human.

          For example (totally made-up code, so obviously it doesn't mean anything)...
          Imagine code that hits a database on a specific table in order to make a query.
          The automatic cut would be after addCondition($1) and nothing else. However, the most logical cuts would be both before the first condition and either before or after the execute, depending on taste, because that better reflects the change in "what we are working on": get a "Read object", configure it, then get and process the results.
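          To make that concrete, here is a hypothetical Java rendering of that imaginary query code (every name is invented). The line breaks are placed at the "logical cuts" described above rather than wherever column 80 happens to fall.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical query-builder, invented purely to illustrate break placement.
public class QueryDemo {
    static class Query {
        private final List<String> steps = new ArrayList<>();
        static Query onTable(String t) { Query q = new Query(); q.steps.add("table:" + t); return q; }
        Query addCondition(String c)   { steps.add("cond:" + c); return this; }
        List<String> execute()         { return steps; }
    }

    public static void main(String[] args) {
        // Cuts grouped by intent: get a "Read object", configure it, run it.
        List<String> result = Query.onTable("users")
                .addCondition("age > 18")    // configure...
                .addCondition("active = 1")
                .execute();                  // ...then run
        System.out.println(result);
    }
}
```

          A width-only formatter would break after whichever call crosses the column limit, which rarely coincides with those intent boundaries.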

          Same with documentation: it's much more readable if the dev phrases their sentences in a way that works with a cut near 80; but even without that, they are better placed to know where to cut so that the words stay in groups, making "going from one idea to another" line up as closely as possible with "going from one line to another", if you see what I mean.

          That's why it's imo so important in practice that devs strive to respect this best practice as much as they can, BUT without getting decapitated by a program just because the optimal way of writing actually needed a few more characters than what was expected.
          -> Analysis tools should have *two* limits: a "recommended" one (just a notice to the dev in full-analysis mode) and a "hard" one (say, recommended + 20) that would indeed justify rejection.
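          That two-limit scheme is easy enough to sketch. In this hypothetical Java checker, 80 (soft, warn) and 100 (hard, reject) are just example values, not anything a real tool mandates.

```java
// Hypothetical line-length checker with a soft and a hard limit.
public class LineLimits {
    static final int SOFT = 80;   // "recommended": just a notice to the dev
    static final int HARD = 100;  // soft + 20: this one motivates rejection

    // Classifies a single source line against both limits.
    static String check(String line) {
        if (line.length() > HARD) return "reject";
        if (line.length() > SOFT) return "warn";
        return "ok";
    }

    public static void main(String[] args) {
        System.out.println(check("x".repeat(79)));   // ok
        System.out.println(check("x".repeat(90)));   // warn
        System.out.println(check("x".repeat(120)));  // reject
    }
}
```

          The point of the soft band between the two limits is that an occasional 90-character line draws a notice, not a rejection.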

          EDIT: oops, just saw your post above mine. Seems we're actually in full agreement. ^^


          • #55
            Well that was about time, but still... Only very modest bump up to 100 characters? Do people still work on old 640x480 monitors or something? I can understand people using their displays in portrait mode, but even there the horizontal resolution has comfortably allowed for way more characters than 80 per row for decades.

            I mean outside of really low end machines, what machines haven't comfortably supported 80 characters per row since what, the Commodore 128? That thing came out in 1985 for crying out loud.
            Last edited by L_A_G; 01 June 2020, 09:26 AM.


            • #56
              Originally posted by kaprikawn View Post

              This is exactly the sort of attitude Linus is railing against, I have some weird, arcane setup, and everyone needs to adapt to it.

              Obligatory XKCD
              Did you *read* my previous comment?

              It boiled down to "I have no problem with this. My preferred Vim font already rules out fitting two 80-column windows side-by-side on my smaller monitors."

              The only thing weird or arcane about my setup is that I happen to be happy at less than 100 dpi, can't afford air conditioning, and don't feel the need to pay for an upgrade which would either pump more heat into the air or preclude pumping less heat into the air, all for minimal benefit.


              • #57
                Originally posted by duby229 View Post

                OT: Just my own opinion here. From my experience sitting in front of one big ass screen is terribly uncomfortable. If you really want the equivalent of multiple monitors it's way more comfortable to actually get multiple monitors.
                I certainly wouldn't upgrade until I've found or developed a comfortable tiling WM solution that fakes the Xinerama hints for a 3x2 grid of monitors.

                That said, since this machine is my everything for media, having a 54" screen would mean nicer results if I want to make an exception to the fullscreening rules, go lie down on my bed, and watch a movie.


                • #58
                  Originally posted by L_A_G View Post
                  Well that was about time, but still... Only very modest bump up to 100 characters? Do people still work on old 640x480 monitors or something? I can understand people using their displays in portrait mode, but even there the horizontal resolution has comfortably allowed for way more characters than 80 per row for decades.

                  I mean outside of really low end machines, what machines haven't comfortably supported 80 characters per row since what, the Commodore 128? That thing came out in 1985 for crying out loud.
                  I believe limiting it to 100 has more to do with concern about how making lines too long makes it more tiring to properly CRLF your eye when you get to the end of a line in word-wrapped text. It's one of the reasons bigger books usually have some mixture of multiple columns and larger fonts rather than longer lines.


                  • #59
                    Originally posted by atomsymbol

                    You won't get readable high-resolution image&video thumbnails that way. Also, captions in charts are easier to read on high-DPI displays, which in many cases avoids zooming the chart to fullscreen to make it readable, thus saving time.
                    I generally use 128x128px thumbnails. Even if they had infinite DPI, preserving their physical size would require me to take a magnifying glass to my screen to read any text in them unless they followed the deviantArt approach of showing almost nothing and scrolling when hovered. To be honest, making that argument sort of sounds like calling a fish defective because it can't climb trees, when climbing trees was never the goal in the first place and optimizing its ability to climb trees would pessimize existing things.

                    As for captions in charts, I prefer to just implement proper zoom/magnifier options (put on shoes) rather than resorting to the "cover the world in leather" approach of rendering everything at a higher DPI when, normally, things are just fine. Comix and its descendants have a nice approach to that: you can hit G to toggle an in-application magnifying cursor, comparable to a screen magnifier's "magnifying cursor" mode, but it pulls from the original source file rather than from the post-scaling output being sent to the screen.


                    • #60
                      Originally posted by kravemir View Post

                      Why 120 chars would be better? Does kernel code contain very long variable names, or functions with lots of arguments?
                      Don't know about the kernel per se, but longer lines help avoid having to wrap comments across multiple lines, as well as having to split `debug('whatever')` messages.
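                      One concrete cost of splitting messages, sketched in Java with an invented message: the wrapped version produces the same string at runtime, but the full sentence no longer appears anywhere in the source, so grepping the code for a message you saw in the logs finds nothing.

```java
public class DebugSplit {
    public static void main(String[] args) {
        // Fits on one (longer) line; grep-able in the source as-is:
        String oneLine = "failed to allocate ring buffer for device queue";
        // Split to satisfy a hard 80-column limit; same runtime string,
        // but grepping the source for the full sentence now fails:
        String wrapped = "failed to allocate ring "
                + "buffer for device queue";
        System.out.println(oneLine.equals(wrapped)); // true
    }
}
```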