
Rust Now Prefers Using The GNU Gold Linker By Default


  • #11
Originally posted by Luke_Wolf

    no it doesn't you moron.
    And that, kids, is how you make your point. /s


    • #12
Originally posted by Ericg

      Oi, Luke, cool it. Mods do exist on these forums, as infrequent as they are.
My patience with anti-OOP fools has worn threadbare to the point where I'm done being nice to people like PinkUnicorn there, who brigade in with disingenuous arguments copied from C programmers who don't understand OOP; those arguments almost always rely on things total newbies do with the class system that have little to do with object-oriented programming. At least the cult of The Unix Philosophy guys, while falling to similar levels of disingenuousness, aren't quite that asinine.


      • #13
Originally posted by Luke_Wolf

        My patience with anti-OOP fools has run rather much threadbare...
I did not mean to disguise myself as someone with the C/Unix mindset; I just happen to see and hear (just like you) many remarks and examples of misuse of programming paradigms/concepts/idioms/etc. As for the number of members, Ogre was known to be a monster (nomen est omen), and any node inside the scene graph was a class with 100+ members. I could look for the article, but it became the de facto example of cache optimization: rearranging the order of the members resulted in a double-digit percentage overall speed-up.
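The mechanics behind such a speed-up are easiest to see with padding: the compiler must align each member, so declaration order changes the object's size and cache footprint. A minimal sketch with hypothetical node fields (the names and types are illustrative, not from Ogre):

```cpp
#include <cstdint>

// Members declared in an arbitrary order force the compiler to insert
// alignment padding between them:
struct NodePadded {
    bool    visible;   // 1 byte + 7 bytes padding (to align the double)
    double  weight;    // 8 bytes
    uint8_t flags;     // 1 byte + 3 bytes padding (to align the float)
    float   scale;     // 4 bytes
};                     // typically 24 bytes on x86-64

// Sorting members from largest to smallest alignment removes most padding:
struct NodePacked {
    double  weight;    // 8 bytes
    float   scale;     // 4 bytes
    bool    visible;   // 1 byte
    uint8_t flags;     // 1 byte + 2 bytes tail padding
};                     // typically 16 bytes on x86-64
```

With hundreds of members the savings compound, and fewer bytes per node means more nodes per cache line during a scene-graph traversal. (Reordering hot members so they share a cache line matters too, beyond raw size.)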

I'm not saying Ogre is useless or anything, but it was not crafted with such things in mind. What I really wanted to say is that Ogre (and rest assured there are a myriad of other cases) has reached a complexity where OOP starts becoming inefficient. By properly identifying roles and responsibilities (say, in an STL-like manner, where allocators are parameters of containers, iterators are responsible for access, etc.), you can reduce the number of members, at the cost of composing them into entities that implement some feature as a union. At best, your class hierarchies will be balanced and every class will have roughly the same number of members. However, as you reach higher and higher levels, your objects grow in size uncontrollably, even if in a given scenario you know you will use only a very small subset of a class's capabilities (members). In GPGPU and at a very low level, this effect is demonstrated through AoS vs. SoA; outside data-parallel algorithms it would be more adequate to call it data-oriented design (DOD).
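The AoS-vs-SoA distinction mentioned above can be sketched in a few lines; the `Particle` fields here are an assumed toy example, not from any real engine:

```cpp
#include <vector>

// Array-of-Structures: each element carries every field, so a pass over
// just `x` also drags `y`, `z`, and `mass` through the cache.
struct ParticleAoS { float x, y, z, mass; };

float sum_x_aos(const std::vector<ParticleAoS>& ps) {
    float s = 0.0f;
    for (const auto& p : ps) s += p.x;   // strided: 4 of every 16 bytes used
    return s;
}

// Structure-of-Arrays: each field lives in its own contiguous array, so a
// pass over `x` touches only `x` data and every cache line is fully used.
struct ParticlesSoA {
    std::vector<float> x, y, z, mass;
};

float sum_x_soa(const ParticlesSoA& ps) {
    float s = 0.0f;
    for (float v : ps.x) s += v;         // dense, fully utilized cache lines
    return s;
}
```

Both functions compute the same result; the SoA layout simply wastes no bandwidth on unused fields, which is exactly the "small subset of the members" problem scaled down to one struct.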

Creating thin views that group some entities together for the lifetime of an operation, without having to touch the entire object and load it into cache, is something that is not possible, or is simply very hard to achieve, with a classic OOP mindset.
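One minimal sketch of such a view, under the assumption of a fat `Entity` class (all names here are hypothetical): a non-owning struct of references that exposes only the fields one operation needs, for exactly as long as the operation runs. Note this illustrates the interface idea; avoiding the cache cost entirely also requires storing the fields separately, as in the SoA layout.

```cpp
// A "fat" object with many members, only two of which the operation needs.
struct Entity {
    float x = 0, y = 0, z = 0;
    float vx = 0, vy = 0, vz = 0;
    int   material = 0, flags = 0;
    // ...imagine dozens more members...
};

// Thin, non-owning view: groups exactly the fields one operation touches,
// valid only for the lifetime of that operation. Nothing is copied.
struct PositionXY {
    float& x;
    float& y;
};

// The operation is written against the view, not against Entity, so it
// neither knows nor cares about the other members.
void nudge(PositionXY p, float dx, float dy) {
    p.x += dx;
    p.y += dy;
}
```

Usage is simply `nudge(PositionXY{e.x, e.y}, 1.0f, 2.0f);` for some `Entity e` — the view is constructed at the call site and discarded immediately after.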

I did not mean to start a flame war, just wanted to make a point, which it seems I did not back up sufficiently.

Originally posted by Luke_Wolf

        Proper OOP dictates that you should use no more than 5 levels of derivation
Do you have any good source where such guidance on "canonical" OOP design is written down? I am an HPC programmer/physicist, not a software engineer, so when it comes to programming I am predominantly self-taught and care about low-level implementation and simple things, not high-level design.