A 2024 Discussion On Whether To Convert The Linux Kernel From C To Modern C++


  • Originally posted by sdack View Post
    C++ was and still often is seen this way. It is mostly people who do not understand the need for the kind of code control a large project has and thus find its features "overkill" or "abominable" or whatever.
    I'm not so sure... I remember quite a few of them having much more regard for ObjC and (much, much later) C#. There was something about the design of C++ (or how it's usually used) that was particularly odious. These are casual conversations from 25+ years ago, so I don't even remember the details.

    EDIT: Actually, your anecdote about the person doing silly things... I kinda have a vague memory of that being part of it. C++ was too easy to be abused in dumb ways, and some of the other object-oriented C variants mitigated that, somehow. Again, memory is hazy.
    Last edited by mercster; 10 January 2024, 08:47 PM.



    • Originally posted by sdack View Post
      What you need are people who can program in C++, not just C, and understand its benefits.
      I would bet there are many more C++ programmers. And many modern C++ programmers cannot program in C.



      • Originally posted by patrick1946 View Post
        I would bet there are many more C++ programmers. And many modern C++ programmers cannot program in C.
        We are still only talking about the use of C++ in the kernel. Nobody is saying that every C++ programmer now wants to write kernel code. If you are having such a hard time staying focused then maybe step out for a while.



        • In my career as an embedded programmer for real-time systems, C++ was/is seen as just extra baggage that hid things from the programmers and actually made things 'harder' to maintain. C was/is closer to the hardware ... what you see is what you get. Memory/flash footprint was important, as was speed. All the 'nice' features of C++ come with a price and can easily be abused. I don't know what it is about some programmers that they have to use 'every' feature of a language and make it obtuse for the next guy. Just stick to the 'simple' OOP concepts where 'it makes sense'. KISS. I've run into this in my current job and it's a 'headache' to maintain. C is much, much easier to maintain.



          • Originally posted by ssokolow View Post

            It's certainly tactless, but "Go [is not] a suitable replacement for C" for the same reason neither Java nor Haskell is a suitable replacement for C: It's got a mandatory garbage collector. That's also why "Go is not even in the same class as..."
            But that's what I wrote in the first place.



            • The big problem with C++, imo, is the direction it is taking with metaprogramming.

              Rather than making metaprogramming a generic first-class citizen that works with the language constructs, they decided to bloat the standard threefold and add one single, specific std::prefab to do each different thing.

              C++ is treating identifiers like keywords now. So you have to learn a tremendous amount of stuff to actually be proficient with it, and memorize a bunch of things that should have been possible to do logically with the core language.

              At this point in time, I find it easier to implement advanced features in C than in C++.

              Because I know how stuff works, I can do it with pointers, memory addresses and macros, but with C++ I have to take the time to look up the particular std::prefab that does what I want, study how it works, live with its limitations and hope I don't hit a design dead end down the line due to std::prefab().limitations...

              And if you want to do something that doesn't have a std::prefab for it, you are screwed anyway. The stuff C++ is only now bringing out is stuff I've been doing manually for years, so its offer is for the language to do it for me, so long as I conform to its vomit-inducing metaprogramming philosophy.



              • As someone who is very familiar with C++11/14/17/20, this "modern C++ is so much better" meme needs to stop.
                1. Let's get real. Everything people hated about C++98 is still there in the latest versions, because backwards compatibility is the name of the game in standards committees. It's just been supplemented with a lot of extra optional stuff that the vast majority of C++ code in the wild does not use, and for the most part is never going to use, because any non-trivial code rewrite (e.g. changing a widely used pointer type to some smart pointer type) is a lot of risky code churn with no immediate user-visible benefit, and the MBAs who take those decisions in software companies think quarterly, not long-term.
                2. The C++ committee got better at releasing new standard versions, but certainly not at carefully designing the features inside of them so that they are actually usable by normal people. Each and every one of them is full of new gotchas that make them hard to use and increase the mental load on someone trying to read and write code that uses them. Here are a few examples just from C++20:
                  • The stated goal of C++20 concepts was to make sure the interface contracts of templates are properly checked, so that compiler errors related to improper template use pop up at interfaces rather than inside implementations. However, as designed, C++20 concepts only check the interface contract on the template user's side (the user of the template must pass in a type that satisfies the concept), and not on the template implementor's side (the author of the templated code should use the operations that the concept specifies and these operations only). Therefore, like documentation, concepts are not verified by the compiler against the code whose interface contract they are supposed to model, and they will inevitably fall silently out of sync with that code as the codebase evolves, resulting in a perpetual return of the horrible compiler error messages that concepts were supposed to fix.
                  • Many were expecting C++20 coroutines to be a viable alternative to generators and async/await in other languages. But alas, they were designed to require a memory allocation on initialization and dynamic dispatch on every use, under the premise that modern compilers will surely know how to elide this overhead when needed. Breaking news: as anyone who has actually tuned C++ code for performance knows, even today's best optimizing compilers are very bad at eliding memory allocations and dynamic dispatch. And thus, this shitty design made C++20 coroutines completely useless for all use cases where fast initialization and call/return performance actually matters.
                  • C++20 modules could have followed the example of almost every other programming language out there and used a simple mapping between source file name and module name. But no. The committee had to mess this up too. They decided that a source file can contain any number of modules, named however you want. Net result: when a compiler sees a reference to a module in a source file, it has no idea in which other source file of the codebase the implementation of the module may be defined. And neither does the C++20 practitioner, for that matter. But unlike with devs, it is not acceptable for compilers to grep every unknown module name across the entire CPATH, due to the disk I/O overhead this incurs. As a result, in order to get modules to work efficiently, GCC had to implement a networked database system to expose the knowledge, acquired from parsing previous source files, of which modules lie where, and build systems are now in the process of being deeply reworked in order to interface with this networked database system. All this just to cover up yet another instance of C++ committee stupidity.
                  • constexpr + consteval + constinit, seriously? I know the committee people fucked up at teaching the world what constexpr functions were about (namely making these functions usable in a context where compile-time evaluation is required, as opposed to either forcing the compiler to evaluate them at compile time or being a necessary step for compile-time evaluation, as most people believe). But there was no need to butcher the language further to cover up this teaching mistake, when a little education about CTFE would have done it AND hopefully put an end to the ongoing constexpr insanity where C++ devs all over the world are making every function constexpr for no good reason, under the belief that this will speed up their code (spoiler: it most likely won't). And please, don't get me started on std::is_constant_evaluated(), which is a minefield of non-reproducible bugs waiting to happen.
                3. While the C++ committee folks are busy botching the above relatively advanced features, they keep postponing fixes to the everyday problems of real-world C++ code. Things like...
                  • The lack of a commonly agreed upon way to handle errors (low-level people won't want to hear about exceptions, high-level people won't check their return codes and errno, mix the two and you have a disaster waiting to happen)
                  • A bunch of incompatible build and dependency management systems, each shittier than the other
                  • UB everywhere as the default way to do things
                  • Completely different design of run-time vs compile-time polymorphism which means that it is an absolute PITA to switch from one to the other when the ever-shifting balance between optimizing for compile-time and run-time overhead demands it
                  • Poor support for extending third-party types, which is by far the easiest way to pass a third-party object implementing an interface A into a third-party component expecting an independently specified interface B which is semantically equivalent but not API-equivalent.
                  • Poor support for code analysis and transformation (or some other kind of compile-time reflection), which you don't directly need yourself but your favorite arbitrary object persistence library needs it in order to stop doing horrible things like embedding a compiler or parsing the textual output of compiler-specific debug features.
                4. Because of this continued history of bad design, which makes C++ features very hard to learn, most C++ developers do not attempt to understand the entire language, but instead cherry-pick features that they find cool and learn just those. This may seem fine in one-man projects or small teams of people from similar backgrounds, but it completely breaks down as soon as more than two people from different backgrounds end up working on a C++ project, because it results in a dialect effect where one person's code cannot be understood by another. For example, in my team, I have people whose C++ dialect of choice is basically C + overloading + RAII, people whose C++ dialect of choice is Java with worse syntax, and people whose C++ dialect leans heavily on C++11/14/17 features with a few sprinkles of C++20. These people do not understand each other's code, nor do they understand each other when they discuss C++ for that matter, and I think this is a serious problem that will only get worse if the C++ standard goes further down its current track.
                5. Because few people feel like they understand even a reasonable subset of modern C++, even fewer people feel comfortable enough with it to teach it. So the vast majority of C++ courses out there are not teaching "idiomatic modern C++"; they are at best teaching C++98 with some newer standard sprinkles added (and at worst teaching C with some tiny bits of C++). Therefore, the average C++ developer you get will have no knowledge of the newer constructs, and no easy way to learn them. If you want a team of competent modern C++ devs, expect to expend as much effort and money retraining them as you would teaching them a completely different language.
                6. Since the C++ committee keeps messing up, and almost never removes failed past experiments, as the only user experience thing they do care about is backwards compatibility, the number of ways to do the same thing explodes over time, and so do unforeseen interactions between features. For a classic example, any nontrivial combination of implicit conversions, multiple constructor syntaxes (+ the most vexing parse that motivated them), function overloading, default arguments, operator overloading, virtual method overriding, templating and specialization makes it extremely hard to know ahead of time which function your program is going to call. Non-experts only get to know by stepping through it in a debugger (or some moral equivalent), and even experts get it wrong regularly. Wasn't one of C++'s selling points against dynamic languages like Python supposed to be that its behavior is predictable ahead of execution?
                7. I will give one thing to modern C++ advocates, though. Learning it does feel like learning a new language, in terms of the amount of effort you have to spend to get there. But my question is: if you are going to learn a new programming language, why learn a crufty, poorly designed superset of a bad language that you will still need to live with in some parts of your codebase until the end of times, when you can instead go for some fresh take on language design that actually fixes some longstanding C and C++ design issues, without keeping all the existing cruft and adding a bunch of new cruft along the way?
                In my opinion, as you may guess from the above rant, "modern C++" is a mistake.

                C++ pushed a lot of innovation in its day, and along the way also made a lot of mistakes due to the concepts being so new. It accumulated a lot of cruft over time as a result, to the point of reaching the limits of human intelligibility. That's fine, we are programmers, we know this situation, we end up there all the time. It's called legacy code, except in C++'s case it's legacy standardese, which is slightly different in some respects, like the level of stringency demanded for backwards compatibility.

                When a legacy codebase has accumulated so much cruft that it has become unintelligible and unmaintainable, what should you do? Should you keep expanding the codebase with more and more hacks to work around the problems that the previous code introduced? Or should you wind down effort on the legacy codebase to the minimum required for basic maintenance, and instead focus most of your energy on rewriting the dying codebase to remove the cruft and make it intelligible again? While the first option may seem easier in the short term, the second option is always the best in the long run. And in a programming language standards context, since we can't fix existing standards, this means moving to new programming languages that have learned the lessons of their ancestors, as opposed to desperately trying to fix the broken old standard in a fashion that keeps making it worse in just as many areas as it makes it better.

                The C and Fortran standards committees have encountered the same problem as the C++ committee, and in my opinion they have handled it a lot better. Instead of trying to fix the problems of existing features with new far-reaching and redundant features, they have largely put their respective languages in maintenance mode, gradually raising the bar for new additions to the point where new releases are mostly made of laser-focused polish features that fix specific, well-understood problems in an actually satisfactory way. I think this is what C++ should do, but it's not as if this were a popular opinion in C++ circles.
                Last edited by HadrienG; 11 January 2024, 06:37 AM.



                • Can somebody explain the ways in which current "concepts" are better than doing the seemingly logical thing:

                  SomeInterface { declare whatever the use case requires };

                  template <SomeInterface T> ...

                  The std committee doesn't appear to work on the "let's figure out what's best for the language from the perspective of the user" principle; it is more of a "let's figure out what's best for the language from the perspective of the companies, industries and their needs and priorities"...

                  And that does not a "user friendly" language make.

                  And an increasing amount of that has to be carried by the compiler, thus relinquishing control over what actually goes down to the hardware.

                  For example, I was recently and unpleasantly surprised by the C++ compiler's inability to optimize away a synchronous lambda passed into a std::function slot. The standard doesn't "promise" that, but I knew and trusted C++ as a language whose compilers will apply every possible optimization in a static, perfectly optimizable context.

                  But no, it trips over and generates an obscene amount of overhead, and this is in fact the reason why std::algos all actually take template parameters: the language can't resolve its own stuff properly otherwise. This in turn creates the problem of template opaqueness, which they now resolve by throwing an entire new concept of Concepts into the language, plus a whole bunch of additional syntax and givens you have to abide by just to help carry it through its own obtuseness. When all of that could have been facilitated with the language's current syntax, at the compiler level.

                  They create a whole bunch of problems to "solve" a single "problem"...

                  C++ is no longer a general purpose language, it is a steaming pile of purpose-specific trash. The sheer unnecessary bloat prohibits it from being useful in the general purpose context for a single developer. You have to have a whole crack team of purpose-specific cogs to squeeze the general-purposeness out of it.

                  And this introduces an even greater divergence down the line, which actually makes it harder to contribute and maintain.

                  It seems like C++ features are not so much the product of intelligent design as some malignant growth that has to somehow manifest technically, and comes out as a kind of brute-force ground meat from a creature that may once have been a general purpose programming language. It is almost like a bad GPT hallucination.

                  And that's actually not exclusive to C++; it's a pattern clearly observable across the industry. It seems like the best thing that can happen to a "good old technology" is to be kept intact and not "improved" in the slightest on the user side of it.
                  Last edited by ddriver; 11 January 2024, 04:51 AM.



                  • I think starting from C++14 is a good idea.

                    But they must limit the grammar to a certain range. Otherwise, I would rather they rewrite the entire kernel in Rust.



                    • Now that's what I've been saying for years. Allowing C++ feature-by-feature would have been so simple! Compare that to the twists and backflips necessary just to get a bit of Rust. With his legendary rant, Linus Torvalds has done harm and a disservice to Linux: hordes of dimwits have been parroting his words for decades (some even today in this very thread).

                      Allowing Rust but disallowing C++ is pure idiocy.

