The Linux Kernel Deprecates The 80 Character Line Coding Style


  • #61
    I use 72 as a soft limit, in all projects I work on. Only for my code though, and not when reviewing other people's code. I personally prefer vertical code, and when I am diffing or splitting terminals, the limit comes in handy. More text editors need to support multiple rulers. Of course, you will occasionally get a senior reviewer moaning about all the line breaks, so in that case I just make it a bit more horizontal, and say, meh, whatever!

    Also, because it's a soft limit, if I really can't get it to look clean and organised under 72 characters, then I will break this limit.

    One benefit of having your own slightly tighter limit is that if you rename variables, you won't suddenly find that loads of lines have broken whatever hard limit the project has defined.
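    To make that concrete, here is a small hypothetical C sketch (made-up names, not from any real project): after a rename, the call would blow past a hard 80-column limit, but wrapping it at the argument list keeps it under my 72-column ruler with room to spare.

    Code:
    #include <stdio.h>

    /* Hypothetical helper, only here so the example compiles. */
    static int packet_queue_push_locked(int queue, int item, unsigned flags)
    {
            return queue + item + (int)flags;
    }

    int main(void)
    {
            /* Before the rename this was "queue_push(3, 7, 0x1u)" and fit
             * easily; after the rename the call is wrapped at the argument
             * list so it stays under the 72-column soft limit. */
            int ret = packet_queue_push_locked(3, 7,
                                               0x1u);

            printf("%d\n", ret);
            return 0;
    }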



    • #62
      Originally posted by Citan View Post
      Not necessarily, or possibly not at all.
      80 characters is a harsh limit that forces people to write unreadable, unsustainable code by using acronyms or shortened names everywhere and by abusing "small steps" or "overnesting" just because they would otherwise go a few characters over 80.
      I wrote database applications with simple GUI interfaces back in the 1990s, and no mouse interface was necessary, but they sure did not look like a mainframe 80x25 text display console. All of my source code could be displayed on an 80x25 monitor; no shortened names, no "overnesting".

      Seriously, good code can be written and displayed on 80x25 monitors IF the programmer knows how to properly construct their code within the limits of the language and the hardware being used.
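      For what it's worth, here is a small hypothetical C sketch (my own made-up example, not from any real project) of the kind of construction I mean: instead of cramming a long condition onto one line or resorting to cryptic abbreviations, factor it into a well-named helper and everything stays comfortably under 80 columns.

      Code:
      #include <stdbool.h>
      #include <stdio.h>

      struct order {
              int quantity;
              double unit_price;
              bool customer_verified;
      };

      /* The descriptive helper name replaces what would otherwise be a
       * long one-line condition in the caller. */
      static bool order_is_billable(const struct order *o)
      {
              return o->customer_verified &&
                     o->quantity > 0 &&
                     o->unit_price > 0.0;
      }

      int main(void)
      {
              struct order o = { 2, 9.99, true };

              if (order_is_billable(&o))
                      printf("total: %.2f\n", o.quantity * o.unit_price);
              return 0;
      }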

      Originally posted by Citan View Post
      It's also problematic when you want to write proper inline documentation, especially in a non-English language (most Romance languages are more verbose than English, so it's very easy to get a 30% or 40% increase in verbosity to express the same thing, unless you don't mind mangling the grammar).
      I used to speak, read, and write German at the high school level. Yes, non-English languages can be more verbose than English, my native language. Yes, sentences in German involve more words than their English counterparts, mostly due to language structure rules and verb tenses. I even wrote a multi-page cultural research paper for my final project in German language class ... on an 80 character width TYPEWRITER. In my programming days I wrote all of my own documentation on an 80x25 monitor, and it printed out nicely on 8.5x11 inch (not quite A4 format) paper. So please stow your excuses about needing more than 80 characters to write documentation, unless you are trying to document some severely abstracted programming language that will probably be dead & forgotten in the next 5 years.

      Originally posted by Citan View Post
      It was legitimate when name length actually made a noticeable difference in performance, and when code was "simpler".
      Today, it's a crutch, because programmers have to change the structure of their code just because of an arbitrary limit.
      An extra 20 characters may not seem like much at first glance, but in my work environment it should be easy enough to respect.
      I'm not sure 120 characters would have been a better choice, actually, but I wouldn't have minded it either.
      The idea that programming labels influence performance makes me so depressed at the drastic decline in modern-day programming ability compared to decades past. Many of today's programmers would fail to understand why something incredibly critical, like the Apollo Guidance Computer that took Mankind (humans, and possibly some germs too!) to the Moon, was written in Assembly and why its user interface in the Command Module was so simplistic by today's standards. Do the web research and learn how real programmers worked back in those days; it will amaze (or confuse) you.

      It sounds to me like you are trying to use programming labels to explain what your function/procedure is trying to do. Back in my day that sort of description was placed in a programming comment, since the traditional compiling process eliminates that sort of excess from the final output binary (at least in all the binaries I ever created); that sort of verbosity is needed by the human programmer, not by the machine code the compiler produces. Yes, some languages do include those labels in the output binary, for some silly reason, as a text segment of the EXE program. That's nothing but needless BLOAT, and something I see a lot from programmers who grew up on Microsoft languages and Windows.
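      To illustrate with a small hypothetical C sketch (my own made-up names, nothing from a real codebase): the comment below costs nothing in the compiled output, whereas the non-static function name does land in the object file's symbol table unless the binary is stripped, so a short name plus a descriptive comment was the traditional compromise.

      Code:
      #include <stdio.h>

      /* Computes the arithmetic mean of the valid samples.  This comment
       * carries the long description and is discarded by the compiler;
       * the exported function name below, by contrast, survives in the
       * symbol table unless the binary is stripped. */
      double mean_valid(const double *samples, const int *valid, int n)
      {
              double sum = 0.0;
              int count = 0, i;

              for (i = 0; i < n; i++) {
                      if (valid[i]) {
                              sum += samples[i];
                              count++;
                      }
              }
              return count ? sum / count : 0.0;
      }

      int main(void)
      {
              double s[] = { 1.0, 2.0, 99.0 };
              int v[] = { 1, 1, 0 };

              printf("%.2f\n", mean_valid(s, v, 3));
              return 0;
      }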

      Originally posted by Citan View Post
      Note that I speak "in general" here, because that "80 char rule" has sadly overrun many languages, tools and frameworks.
      I have no opinion on the particular "Linux kernel" topic, because I'm aware it's far above what I can comprehend for now.
      The 80 character limit did not overrun anything; it is something that existed before these languages, tools, and frameworks came to be. It's a holdover from previous computing technology as another post in this thread correctly points out. Heck, even some old technology worked at 132 columns; I do miss those DEC terminals.



      • #63
        Originally posted by atomsymbol
        I think you might have forgotten to take into account the Flynn effect. The Flynn effect is in contradiction with the belief that spreading literacy to the masses lowers the average/median quality of knowledge.


        I don't think I forgot the Flynn effect; I think I simply didn't know about it :-). Thank you for the link, I've found it very interesting. If I understand it correctly, the Flynn effect seems to have plateaued and could be reversing. But it would have been in effect over the time span of the graphic I was replying to, so your point is relevant. In fact, the graphic ended in 1950 and I was ranting about much more recent experiences, so I wasn't making much sense. It's possible milkylainen is right and we have simply changed our composing or punctuation since the 16th century. After all, the Romans practically didn't use whitespace. Maybe it's just style.

        I don't think spreading literacy lowers the quality of knowledge. I suspect that we're currently replacing literacy with some kind of written reflexes that could lower the quality of knowledge. Literacy spread from 1451 (Gutenberg's press; I'm not very sure about what came before) until sometime recent, maybe 2004 (a symbolic date, the start of Facebook). We're drowning knowledge in spin, hype, and permanent interruption. Since the printing press (in a slow process, perhaps accelerated since the 18th century) everyone could read. Since the Internet (in a not-so-slow process) everyone can write, that is, publish. I didn't see a problem with that in the 20th century, but it has somehow turned into a kind of communication centered on ego rather than content, mediated by a few centralized services that exploit attention span and endocrinology practically as a drug to direct mental activity to the highest bidder, and that has negative consequences for the quality of knowledge.

        And sorry if I'm Eurocentric. It's not out of malice but out of ignorance.



        • #64
          Originally posted by atomsymbol
          It is probable that within a few years the world will move to high DPI on desktops and notebooks. 1080p 24" displays will over time become obsolete technology. Mainstream GPUs will be able to render games at 4K/60Hz, which will become the dominant gaming resolution. Netflix etc. might begin to stream movie files upscaled off-line from 720p/1080p to 4K. For light desktop use (e.g. web browsing), 4K resolution is supported by most new iGPUs. Most new images on the web will be optimized for about 150 DPI, so they will have to be downscaled by the web browser when viewed on a 90 DPI display.
          I'll cross that bridge when I come to it. What I have now suits me fine and, if nothing else, maybe I'll get a nice surge of thrift store monitors at my preferred size-resolution pairing before I have to move on.

          Originally posted by atomsymbol
          Text that has a physical height of 2 millimeters (uppercase letters: 2 mm, lowercase letters: 1 mm, viewing distance: 50-60 cm) is easily readable on a 4K display without the need for a magnifying glass. The previous sentence might be true (my personal guess) for more than 75% of the world population, but I do not have concrete statistics available to determine whether 75% is an accurate figure.
          1. I sit about 90cm from my monitors and, if I found myself sitting 50-60cm away, I'd book an appointment with my optometrist because I'd be worried about needing glasses.
          2. Thumbnails containing text still seem to me like a bad solution struggling to justify itself. Outside of specialized image browsers, I don't even use thumbnails bigger than 22x22 px (about 7 mm to a side at my current DPI), relying instead on tabular/list-view modes like the Detailed view mode in file managers, where icons only serve to identify file types without my having to parse the text in the Description column.
          3. When I care about the text content of files, I use search systems which produce excerpt results formatted more in the vein of Google or ripgrep.



          • #65
            Originally posted by atomsymbol
            It is probable that within a few years the world will move to high DPI on desktops and notebooks. 1080p 24" displays will over time become obsolete technology. Mainstream GPUs will be able to render games at 4K/60Hz, which will become the dominant gaming resolution.
            So "it's probable" that technology will improve and high-end features will become the new normal? Wow, never saw that one coming. Thanks for the visionary tips there, bro.

            (Sent from my Pentium II with a 640x480 VGA monitor)
            Last edited by Larry$ilverstein; 01 June 2020, 02:12 PM.



            • #66
              I am sure there are use cases where long code lines make sense, and strictly disallowing long lines doesn't seem like a flexible solution to me. Lines should have a maximum length such that they fit on a 1080p screen without the need for scrolling, so I would suggest something around 180 characters.
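              Rough arithmetic behind that suggestion (my own assumption: a typical monospace glyph is about 9 px wide at a comfortable size on a 1080p panel):

              1920 px / ~9 px per column ≈ 213 columns
              213 columns, minus room for line numbers, gutters and a scroll bar ≈ 180-200 usable columns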



              • #67
                The size of the monitor is not that relevant. A human reads at a comfortable distance and viewing angle to the screen, so the maximum comfortable line length is a fixed number regardless of monitor size. And this number is different for people with different corrective lenses.

                We also have to realize that it is very common for people to use multiple windows and tile them. So window size is really the key, not monitor size.

                A comment at the end of a line is also not an excuse to use a long line. It is usually better to put the comment on its own line, as in the example below.
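                A quick hypothetical C example of what I mean (names made up for illustration):

                Code:
                #include <stdio.h>

                int main(void)
                {
                        int base_timeout_ms = 250, retries = 4, timeout_ms;

                        /* Trailing comment pushes the line wide: */
                        timeout_ms = base_timeout_ms * retries; /* scale the timeout with the retry count */

                        /* The same comment on its own line keeps the code line short.
                         * Scale the timeout with the retry count. */
                        timeout_ms = base_timeout_ms * retries;

                        printf("%d ms\n", timeout_ms);
                        return 0;
                }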



                • #68
                  Originally posted by atomsymbol

                  I wasn't clear about what I meant by "thumbnails". Sorry about that. I mean: image thumbnails, video thumbnails, any kind of GUI icon, youtube/netflix/primevideo/etc home pages, desktop gadgets, weather reports, workspace switcher icons in KDE/Xfce/etc. Basically, anything that is smaller than 400x400 pixels @4K, or 200x200 pixels @1080p@90dpi.
                  Ahh. I still see it as folly to design a UI where you're expected to read text in an image or video thumbnail, GUI icon, or workspace switcher icon, aside from possibly having a three-character file extension to disambiguate different archive/document/image/etc. format icons. I don't use desktop widgets aside from Conky, where I'm using a font that hints very nicely.

                  I suppose YouTube thumbnails count, but those are much bigger.



                  • #69
                    The ideal maximum line length is somewhere around 100 characters, for reasons rooted in the physiology of human vision.

                    The functional area of the retina of the human eye, also known as the fovea, can register only a few degrees of arc, which subtends only a small portion of the screen's width at a comfortable reading distance of 40 to 60 cm (assuming appropriate corrective lenses). That's about 40 characters for a reasonably sized font in a well-designed typeface. Scanning the text horizontally requires coordinated muscle movements (which our brain is fairly good at) from one end of the text to the other. Information has to be absorbed and processed during this scan. At the end of the scan, the eyes move back to the starting margin, and focus is required to find the start of the next line -- focus that takes away from the processing required to understand the text. Lots of other things can also reduce this focus: poorly designed typefaces, low contrast, bad rendering, and poor eyesight all contribute to reduced understanding of the text just read. Shorter line lengths make it easier to understand the line that was just read in the face of all these negatives.

                    Too much whitespace on the left (for an LtoR script) makes it far more difficult to read.

                    700 years of typographic design experience have shown that relatively short lines of text (3 or 4 inches, 7 to 10 cm), arranged in left-justified, ragged-right paragraphs and set in a serifed typeface, make for the easiest-to-read text and yield the best comprehension.

                    All of these rules and reasons continue to apply to the literature that is computer program code. 80 columns was fine for decades, especially in the age when linkers only supported 6 character symbols. I still have stacks of punched cards from my university days. Today, 80 seems frustratingly low as names grow longer and vast programs with tens of thousands of names require global uniqueness. Still, the limits of human vision and comprehension must be taken into account. 100 characters seems like a good suggested line length. Sometimes you might still need to go over, but 100 should be more than enough to express the start of a thought that can be carried over to successive lines. Exceeding that limit should be rare and require a solid explanation to your readers.

                    Minimizing indentation and function length remain very important aids to promoting good understanding of your ideas on a page, too.
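                    As a small illustrative C sketch (hypothetical code, not taken from the kernel), guard clauses with early returns keep both indentation and line length down compared to nesting every condition:

                    Code:
                    #include <stddef.h>

                    struct request {
                            int ready;
                            size_t length;
                    };

                    /* Nested version: every condition adds an indentation level,
                     * eating into the usable line width. */
                    static int handle_request_nested(const struct request *req)
                    {
                            int ret = -1;

                            if (req != NULL) {
                                    if (req->ready) {
                                            if (req->length > 0) {
                                                    ret = 0; /* real work here */
                                            }
                                    }
                            }
                            return ret;
                    }

                    /* Guard-clause version: the interesting code stays at one
                     * indentation level and fits comfortably within 80 columns. */
                    static int handle_request(const struct request *req)
                    {
                            if (req == NULL || !req->ready || req->length == 0)
                                    return -1;

                            return 0; /* real work here */
                    }

                    int main(void)
                    {
                            struct request req = { 1, 16 };

                            return handle_request(&req) | handle_request_nested(&req);
                    }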



                    • #70
                      Originally posted by atomsymbol
                      I hope you don't believe that, just because it is obvious that a consistent theory of dark matter and dark energy will revolutionize astronomy and become the new norm in physics, you therefore know how and when the revolution will happen ...
                      lol, your false equivalences are even more cringe than his visionary predictions.

                      Originally posted by Larry$ilverstein View Post
                      (Sent from my Pentium II with a 640x480 VGA monitor)
                      Originally posted by atomsymbol
                      Really?
                      Yes.

                      Last edited by Larry$ilverstein; 01 June 2020, 08:33 PM.

