GNOME 3 Might Be Too Resource Hungry To Ever Run Nicely On The Raspberry Pi


  • Originally posted by cynical View Post
    If you think avoiding errors is so important, it's surprising to me that you would be in favor of having an additional vector for mistakes rather than taking the decision out of error-prone human hands. Not having to do manual memory management is a feature. We only tolerate it for the lower level languages because we have to.
    Who's "we" again? I get it: 90% of "programmers" out there just treat it as a normal job; they don't think about code outside of their working hours. They just go to get paid, do the job as simply as possible, and be done with it. C to them is a "necessary evil", only in cases where the boss tells them to do it.

    C is for real programmers. They code because they care about the end result; analogous to artistic integrity, if you will. And these aren't only my words: obviously the guy who initially wrote the kernel shares these beliefs. You know, people who like to write code and be proud of the result, not just get it done ASAP "as long as it works".

    Yes, it might have flaws, but those flaws *can* get fixed. They *will* get fixed, so it will be progressively *better* with time.

    On the other hand, there is no fix for the bloated runtimes or slowness caused by "modern languages". NO FIX EVER. Forever sucking.

    Originally posted by cynical View Post
    If we had to do everything in C, we wouldn't be using this board right now because web technology would be in the stone age. Clearly, there is such a thing as tradeoffs, and a reason why languages like JavaScript exist.
    No, we'd be using a much superior board that loads instantly, like it should considering the power of hardware these days. Obviously it wouldn't have been as quick to develop as this board from scratch, but we'd have one eventually, because people would contribute to fewer projects, which would end up ultimately better. Hardware doesn't improve just so software can shit on its efforts because devs want to pump out "software" of crap quality.

    Of course, using C as a language in a browser is suicide, because you're not supposed to run such a powerful language in a browser. JavaScript "makes sense" there, although it should be something even MORE lightweight and simple, since JS is already causing so many security issues in browsers.

    See this: http://idlewords.com/talks/website_obesity.htm

    Explains it well enough. It is amusing to see how much shit people tolerate these days, when we should expect MUCH MUCH better from our hardware. If devs were forced to (i.e. if C were the only available language), we'd have a much better world.

    Originally posted by cynical View Post
    Please educate me. As far as I know, C has to have null terminated strings because of what a string is in C (an array of chars) and the fact that it decays to a pointer. How can you know where the array ends without some kind of symbol telling you, since C doesn't handle those checks itself?
    You can always use your own macros (yes, macro abuse is legit in C, since it doesn't have constexpr metaprogramming like C++). Nobody forces you to use null terminated strings. Make a struct, use byte-prefixed strings or whatever, unlike other languages where flexibility is thrown out the window for "programmer convenience". Ergo, C as a language is superior since you can do both X and Y.

    Originally posted by cynical View Post
    In many cases, the result is not having a product at all.
    Yes. Fewer, higher-quality products would be better.

    If implementing X project from scratch were 10x more difficult, people would band together and contribute more to existing projects, ensuring higher quality. There will be security vulnerabilities, but they will eventually get fixed. Constantly improving instead of "writing from scratch" means you don't throw out all the progress already made.
    _______

    Don't you find it absolutely ridiculous that we get "modern" APIs like Vulkan, which are akin to C in the graphics programming world and crash at the slightest programming error (segfault or, even worse, a lockup of the entire GPU), when we had "bloated" hand-holding APIs before, such as OpenGL, which would be akin to, I don't know, maybe Python? We call this "progress" in the graphics world because we need to start squeezing out performance.

    And yet, on the other hand, CPUs have been pretty much stagnant for the past decade, with only minor performance improvements. And it's on CPUs that we want even WORSE performance from software, for "convenience"? To me this is completely absurd, full stop.



    • Originally posted by Weasel View Post
      Who's "we" again? I get it, 90% of "programmers" out there just treat it as a normal job, they don't think about code outside of their working hours.
      Do you enjoy complexity for the sake of it? Time spent on memory management is time taken away from solving problems, and it introduces the potential for mistakes on the part of the programmer. All things being equal, it is objectively not desirable for a programmer to have to do it.

      Originally posted by Weasel View Post
      C is for real programmers.
      /eyeroll

      Originally posted by Weasel View Post
      Javascript "makes sense" there, although it should be something even MORE lightweight and simple, since JS is already causing so many security issues in browsers.

      See this: http://idlewords.com/talks/website_obesity.htm

      Explains it well enough. It is amusing to see how much shit people tolerate these days, when we should expect MUCH MUCH better from our hardware. If devs were forced to (i.e. if C were the only available language), we'd have a much better world.
      Nothing to do with the language. Everyone has a choice on whether they want to load their page with a million scripts/external resources or not.

      Originally posted by Weasel View Post
      You can always use your own macros (yes, macro abuse is legit in C, since it doesn't have constexpr metaprogramming like C++). Nobody forces you to use null terminated strings. Make a struct, use byte-prefixed strings or whatever, unlike other languages where flexibility is thrown out the window for "programmer convenience". Ergo, C as a language is superior since you can do both X and Y.
      How does a macro, struct, or byte-prefix replace a null terminating character for strings?



      • Originally posted by cynical View Post
        Do you enjoy complexity for the sake of it? Time spent on memory management is time taken away from solving problems, and it introduces the potential for mistakes on the part of the programmer. All things being equal, it is objectively not desirable for a programmer to have to do it.
        But it is objectively desirable for him if he cares about the end product.

        Originally posted by cynical View Post
        Nothing to do with the language. Everyone has a choice on whether they want to load their page with a million scripts/external resources or not.
        I know, I was simply saying that people are tolerant of too much shit today out of laziness of the developers or whatever other reason. We should demand and expect better.

        Originally posted by cynical View Post
        How does a macro, struct, or byte-prefix replace a null terminating character for strings?
        I think I understand your confusion now. Strings are just arrays of chars in memory; "blah" is just convenience for { 'b','l','a','h',0 }. You insist on using "blah" (a constant string literal) but you don't have to, because you're not limited to it. So make a macro without it (well, some compilers have extensions for wide strings, like the L prefix). You will have to handle it like an array, manually, with the macro giving you the fields (such as length).

        With C++ it's much easier to make a generic class that can convert -- at COMPILE time (meaning the output is identical and not bloated) -- a string such as "blah" to anything you want. Literally. C is lacking in compile-time capabilities, but it can be done, just a bit more manually and without constant string literals. (In C++, use a constexpr class, a constexpr constructor, or even the literal operator"".)



        • Originally posted by Weasel View Post
          But it is objectively desirable for him if he cares about the end product.
          I'm saying that if you could have the performance of C and a quality end product without having to deal with memory management, it would be preferable not to do memory management.

          Originally posted by Weasel View Post
          So make a macro without it (well, some compilers have extensions for wide-strings though, like the L macro). You will have to handle it like an array or whatever it is, manually, with macro giving you the field (such as length).
          That's great, you have a workaround. Now imagine if that flaw was not there in the first place and you didn't have to think about any of this whatsoever. That you could just use the normal features of the language without having to code defensively. That's what I'm talking about when I say that C is flawed. You haven't actually solved the problem, because the problem is in the language.

          Originally posted by Weasel View Post
          Yes, it might have flaws, but those flaws *can* get fixed. They *will* get fixed, so it will be progressively *better* with time.

          On the other hand, there is no fix for the bloated runtimes or slowness caused by "modern languages". NO FIX EVER. Forever sucking.
          Now you're just trolling. First of all, many of the problems in C cannot be fixed because it would break backwards compatibility. Second, it was not even feasible to use JavaScript for anything other than simple web scripts in the past, which is why you know it as a 'scripting language', yet today it is used to write entire apps, and is completely displacing languages like Java on the server (where you can always throw more computers at the problem, while you can't say the same thing about developers). That is only possible because of massive improvements to the language and runtimes. So you're objectively wrong here, and you know it.

          I actually like C, but I also really like the dynamic nature of LISPy languages. Not everything needs to be directly in touch with the hardware. I assume you like it because it gives you maximum control and efficiency, and believe that mastery over these things is a sign of skill right? I value flexibility and speed of development more. It's not an either/or thing, just a matter of taste. It all matters in the end.



          • Originally posted by cynical View Post
            I'm saying that if you could have the performance of C and a quality end product without having to deal with memory management, it would be preferable not to do memory management.
            Yeah, but that's not going to happen. "Performance of C" means literally that, of course, not "20% slower is fine" or whatever arbitrary factor (and also memory consumption, load time, and other side effects like halting during GC, depending on the implementation).

            It's not that C is magic, but these kinds of things need to be programmed. Of course, an AI or toolkit could code it perfectly optimally in the future, but then it would still be C. Just because you don't write it yourself doesn't mean it's not there; the language is still used, proving its superiority, even if not by humans (talking of a hypothetical future where AI codes stuff).

            Originally posted by cynical View Post
            That's great, you have a workaround. Now imagine if that flaw was not there in the first place and you didn't have to think about any of this whatsoever. That you could just use the normal features of the language without having to code defensively. That's what I'm talking about when I say that C is flawed. You haven't actually solved the problem, because the problem is in the language.
            How is that a problem of the language? It *allows* you to do it, and it's not a workaround; you use the language spec to do it.

            Just because YOU or whoever else bashes it doesn't want to deal with X and Y doesn't mean the language has a problem. It has a problem when it can't be used for a specific task (cannot, not won't!), OR if it produces something slower or more bloated than another language (less efficient, etc.), or by any other measurable, objective factor.

            Saying stuff like "it's not convenient for me to write in it, therefore the language has a problem" is absolutely ridiculous. Maybe it's your (or whoever's) problem using it, or you need to learn to use it and code properly.

            Might as well say stuff like "I don't like assignments being x = y, I'd prefer something like x <- y, therefore the language has a problem, definitely not me!!"


            No point in answering the last part, since you still go on the assumption that your preferences are "many of the problems in C". But you realize I was talking about the application, right? I mean, the application's flaws can be fixed.

            There is NO application vulnerability or piece of code in existence, written in C, which cannot be fixed. If there were, then yes, the language would have a problem. But as it stands, it's fully the application code's fault, which CAN get fixed with the CURRENT capabilities of C. On the other hand, some things are just straight-out impossible in other languages, which still rely on magic wrappers or libraries (implemented in C, usually; the irony is obvious). Languages without pointers especially are just horribly limited.

            And as I said, if vulnerabilities keep getting fixed and people contribute more to pure C projects, we'd get incremental security, instead of tons of alternative apps "written from scratch" thanks to the ease of the language (but also its bloat), replicating a lot of the work that went into squashing bugs.


            Can I also say that my preference is to code with pointers (literally); a language without pointers has HUGE problems because I just can't get my head around it. Yes, it's the language's fault due to my pointer preference, and no, I'm not trolling. To me, people who don't understand pointers, or how easy it is to see how they work, should find a different goal in life. Coding is not for them.

            I'm just using your logic here: my preferences make other languages inherently bad, right?
            Last edited by Weasel; 06-05-2018, 08:40 AM.



            • Originally posted by Weasel View Post
              ...

              Can I also say that, my preference is to code with pointers (literally), a language without pointers has HUGE problems because I just can't get my head around it. Yes, it's the language's fault due to my pointer preference, and no I'm not trolling. To me, people who don't understand pointers or how easy it is to see how they work, should find a different goal in life. Coding is not for them.

              ...
              I know that if a language doesn't have proper pointers you lose some flexibility, but I think it's worth noting that many languages retain some of the power of pointers even though they don't expose real memory pointers to the programmer. For example, in Python 2 & 3 all function arguments are, sort of, passed by reference. When you pass an argument to a function or method, Python doesn't make a copy of that argument; instead it passes a reference (sort of like passing a pointer to a function in C). So in Python you gain performance, as a full copy doesn't have to be made, taking CPU and memory. This is in contrast to PHP (and even C, in fact), where it's pass by value.

              My comment above isn't meant to be an argument for or against pointers and it's not meant to suggest that pass-by-reference languages like Python can be equivalent to languages with pointers. I'm just pointing out that some of the expressiveness and performance benefits that can be gained by use of pointers can also be exposed in other ways in other languages; often safer too.

              I too like C pointers



              • Originally posted by ermo View Post
                Personally, I had to switch away from GNOME because simply idling at my 3x1080p desktop would put a non-trivial amount of load on my system and make it feel jerky
                On my i7-4510U with integrated gfx it's the other way round: GNOME feels faster and way lighter than KDE. I've been trying every new major KDE version on a few machines but I always have to go back to GNOME.

                Fanboys aside, IMO many people just don't like GNOME's workflow (neither did I when I first tried it, but I can't do without it now), and the fact that its interface seems too simple in comparison (again, KDE now feels way too cluttered to me and I don't miss any particular feature or setting).



                • Originally posted by cybertraveler View Post
                  I know that if a language doesn't have proper pointers you lose some flexibility, but I think it's worth noting that many languages retain some of the power of pointers even though they don't expose real memory pointers to the programmer. For example, in Python 2 & 3 all function arguments are, sort of, passed by reference. When you pass an argument to a function or method, Python doesn't make a copy of that argument; instead it passes a reference (sort of like passing a pointer to a function in C). So in Python you gain performance, as a full copy doesn't have to be made, taking CPU and memory. This is in contrast to PHP (and even C, in fact), where it's pass by value.

                  My comment above isn't meant to be an argument for or against pointers and it's not meant to suggest that pass-by-reference languages like Python can be equivalent to languages with pointers. I'm just pointing out that some of the expressiveness and performance benefits that can be gained by use of pointers can also be exposed in other ways in other languages; often safer too.

                  I too like C pointers
                  References are also heavily used in C++ in the way you describe, but I meant more like people who are unable even to understand pointers, not just passing objects by reference to avoid copies. Those people, IMO, have no business writing software. But that's just my opinion, and it doesn't *necessarily* make a language objectively bad (I was just using his own logic there). Lacking pointers, however, does make it worse, since it's, well... lacking something essential.

                  Unfortunately, while I like C++ personally (heresy!), it's quite convoluted in this respect; all the move-semantics complications and the syntax (especially && when used in templates) can make most people's heads spin, and I can't blame them. C keeps it simple (though I wish it had the constexpr capabilities of C++, just not the template complexity).
                  Last edited by Weasel; 06-05-2018, 06:16 PM.



                  • The old desktops you mentioned before are faster than contemporary ones because they do less: Gtk1 is faster than Gtk2, for instance, but the former lacks Unicode support entirely.


                    C is nice, but it lacks abstraction and expressiveness. It's pure luck if your recursive functions get tail-call optimized. And don't get me started on the nightmare called the preprocessor. Sane programming languages have macro support that operates on the language's AST, not a tool that does text substitution, producing code that may never get syntax-checked.

                    I, however, share your concerns about garbage collection, although it's good enough in most cases if the garbage collector is well optimized. Funny side note: GC is usually faster than using C++'s shared pointers.


                    Luckily, there are alternatives to C which don't have the aforementioned problems. While I used to code C and do C++ programming at work, I wouldn't start anything new in those two languages.



                    • Originally posted by Weasel View Post
                      I'm just using your logic here: my preferences make other languages inherently bad, right?
                      I'm not talking about preferences, so no.

