Intel Wants To Contribute Parallel STL Support To libstdc++ / libc++


  • #11
    Originally posted by d4ddi0 View Post
    I was very surprised in 2012 to find that not only was libstdc++ not parallelized, in places it wasn't even reentrant (and thus not safe to use in threaded code at all).
    Because for most of what was in libstdc++, it didn't make sense to embed either threading or locks; most of that belongs at a higher level. Threading (still) isn't something one does naively (and expects good results).

    Originally posted by d4ddi0 View Post
    No need to worry here about hurting another vendor.
    Yes, there is. In their closed-source libraries and tools, Intel has a history of shunting execution off to a slow path unless an Intel-branded CPU was detected. This made their CPUs look artificially fast and hurt anyone running AMD (or another vendor). I'm glad to hear this contribution is vendor-neutral; then again, it would be harder to get away with that nonsense in open source.

    Comment


    • #12
      Originally posted by davidbepo View Post
      YES YES!!
      For me it's the only useful feature of C++17, and it's sad that the compilers don't support it.
      ;-)

      The initializer lists for containers are pretty nice too, even if they are just syntactic sugar.

      Comment


      • #13
        Originally posted by hoohoo View Post
        The initializer lists for containers are pretty nice too, even if they are just syntactic sugar.
        How is that different from brace-initialization, in C++11?

        Edit: it seems they just added some new rules around it: http://mariusbancila.ro/blog/2017/04...ced-init-list/

        Last edited by coder; 02 December 2017, 08:10 PM.

        Comment


        • #14
          Originally posted by coder View Post
          How is that different from brace-initialization, in C++11?

          Edit: it seems they just added some new rules around it:

          http://mariusbancila.ro/blog/2017/04...ced-init-list/
          To be honest I am not always sure which new feature came from which revision (0x, '11, '17. Isn't there a '13 too or did I imagine that one?). For 20-odd years C++ was the unchanging firmament upon which the universe rested. Then all these changes came.

          When, I wonder, will the kitchen sink be added?

          Comment


          • #15
            Originally posted by hoohoo View Post
            To be honest I am not always sure which new feature came from which revision (0x, '11, '17. Isn't there a '13 too or did I imagine that one?). For 20-odd years C++ was the unchanging firmament upon which the universe rested. Then all these changes came.
            Many references indicate in which standard revision each feature was added (or deprecated).

            Originally posted by hoohoo View Post
            When, I wonder, will the kitchen sink be added?
            I hear you. I'm also troubled by the new rate of change. Their stated policy seems to be updating the standard every 3 years, with most big changes introduced on the odd-numbered updates. So, C++11 and C++17 are the big updates, while C++14 and C++20 are like maintenance updates.

            I think 11 and 14 made a lot of sense. In fact, many of those changes were overdue. I wish they could've just held off on any more big changes until at least C++20. I work on a variety of platforms, including an embedded platform and an enterprise platform with lagging compilers. It's already caused me some trouble when open source libraries I've wanted to use have eagerly embraced C++14 and C++17 features not supported by some of these platforms' compilers.

            Beyond that, I just have trouble recommending C++. I've made a career of using it, but it's gotten so big and unwieldy that I'm genuinely concerned for its future. We've seen this movie before, with languages like Ada and Common Lisp.

            Comment
