Glibc's Slow Turnaround For Y2038 Fixes Is Frustrating

  • #31
    Originally posted by andreano View Post
    Another solution (other than adding bits) is to rewrite every app.
    I wonder why nobody is pursuing such an approach.

    Comment


    • #32
      Originally posted by atomsymbol
      This is primarily caused by C not having native support for arbitrary precision integers from the start (from year 1972).
      Well, this is primarily caused by CPUs not having native support for arbitrary precision integers from the start.
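      For a concrete picture of the fixed-width limit this thread is about, here is a minimal C sketch (assuming a host where time_t is already 64 bits, so the wrapped value can still be printed) of the last second a signed 32-bit counter can represent and where it wraps to:

      ```c
      #include <stdint.h>
      #include <stdio.h>
      #include <time.h>

      int main(void)
      {
          /* Last second a signed 32-bit seconds-since-1970 counter can hold:
             2038-01-19 03:14:07 UTC (ctime() prints it in local time). */
          time_t last = (time_t)INT32_MAX;
          printf("%s", ctime(&last));

          /* One tick later a 32-bit time_t wraps to INT32_MIN, which
             corresponds to a date back in December 1901. */
          time_t wrapped = (time_t)INT32_MIN;
          printf("%s", ctime(&wrapped));
          return 0;
      }
      ```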

      Comment


      • #33
        Originally posted by pal666 View Post
        I hope nobody is designing something new on a 32-bit platform
        Embedded developers tend to use hardware that matches their needs, and fast 64-bit processors are more often than not unneeded. They are more expensive, require more expensive companion components, and so on. Not every device needs a 2 GHz 64-bit processor and 4 GB of DDR4 RAM.

        Comment


        • #34
          The arguments he gives are not on a technical level; instead, he says that only a native English speaker with a deep understanding of glibc internals and the previous development process will be capable of providing such a patch in an acceptable way.
          What a racist bigot.
          Last edited by rmoog; 16 July 2019, 06:36 AM. Reason: language

          Comment


          • #35
            Originally posted by Emmanuel Deloget View Post

            Well, this is primarily caused by CPUs not having native support for arbitrary precision integers from the start.
            And indexes or timestamps over 64 bits being a silly idea. You don't need indexes larger than the number of atoms in the universe, or timestamps that extend millions of times longer than the lifetime of the universe.
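            A rough back-of-the-envelope check of those ranges, assuming plain signed seconds counters:

            ```c
            #include <stdint.h>
            #include <stdio.h>

            int main(void)
            {
                const double seconds_per_year = 365.25 * 24.0 * 3600.0;

                /* A signed 32-bit seconds counter covers roughly 68 years from
                   1970, hence the January 2038 rollover. */
                printf("32-bit range: ~%.0f years\n", INT32_MAX / seconds_per_year);

                /* A signed 64-bit counter covers roughly 292 billion years,
                   about 20x the current age of the universe. */
                printf("64-bit range: ~%.2e years\n", INT64_MAX / seconds_per_year);
                return 0;
            }
            ```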

            Comment


            • #36
              Originally posted by Emmanuel Deloget View Post

              Well, this is primarily caused by CPUs not having native support for arbitrary precision integers from the start.
              Well, good luck designing a CPU with built-in malloc, or one that can magically extend the size of registers or memory addresses at will.

              By the way, you can easily simulate arbitrary precision integers if you want to, and earlier CPUs certainly DID have that capability. The problem is how slow they are, since they require dynamically allocated memory, and how little use they are.
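              A minimal sketch of that simulation (the limb_add helper and the 32-bit limb size are just illustrative assumptions): the carry is propagated by hand across an array of fixed-width words, which is exactly the part that needs as much memory as the numbers demand.

              ```c
              #include <stddef.h>
              #include <stdint.h>

              /* Add two little-endian arrays of 32-bit "limbs" of equal length n,
                 propagating the carry manually -- arbitrary precision simulated
                 on top of fixed-width registers. */
              static uint32_t limb_add(uint32_t *dst, const uint32_t *a,
                                       const uint32_t *b, size_t n)
              {
                  uint32_t carry = 0;
                  for (size_t i = 0; i < n; i++) {
                      uint64_t sum = (uint64_t)a[i] + b[i] + carry;
                      dst[i] = (uint32_t)sum;         /* low 32 bits of this limb */
                      carry  = (uint32_t)(sum >> 32); /* overflow into the next limb */
                  }
                  return carry; /* nonzero means the result needs one more limb */
              }
              ```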

              Comment


              • #37
                Originally posted by pal666 View Post
                glibc didn't exist in 1972, thus it's irrelevant
                But, if we let the clock roll over... then maybe it did?

                Comment


                • #38
                  Originally posted by cjcox View Post
                  But, if we let the clock roll over... then maybe it did?
                  I think in that case it will.

                  Comment


                  • #39
                    Originally posted by rmoog View Post
                    What a racist bigot.
                    It's unclear how native English speech relates to race.

                    Comment
