Another Year Closer, Linux 4.21 Getting More Preparations For Y2038 Problem


  • ssokolow
    replied
    Originally posted by starshipeleven View Post
    Using some arbitrary time they said Jeezus was born on worked fine for 2 millennia for most of the western world.

    Informatics has its own Jeezus, apparently. What was born on Jan 1 1970?
    It's the "nice round number" version of UNIX's birthdate.



  • starshipeleven
    replied
    Originally posted by AsuMagic View Post
    Don't worry, kernel devs: By that time, Google will have enslaved humanity and we'll all be running Fuchsia.
    I welcome our new purple-dressed overlord. My body is ready.



  • starshipeleven
    replied
    Originally posted by xinthose View Post
    Isn't there a better way to tell time than using a 1970 epoch?
    Using some arbitrary time they said Jeezus was born on worked fine for 2 millennia for most of the western world.

    Informatics has its own Jeezus, apparently. What was born on Jan 1 1970?



  • starshipeleven
    replied
    Originally posted by Kemosabe View Post
    Jokes aside, it is actually worth considering whether the computing landscape can change so dramatically within the next 20 years that Linux and friends might no longer play a relevant role by then...
    Can it?

    The current landscape is pretty different from what it was 20 years ago, but most markets are now saturated and hardware doesn't gain much power over many years...



  • jabl
    replied
    Originally posted by xinthose View Post
    So instead of `int` it will now be `unsigned long long int`? Isn't there a better way to tell time than using a 1970 epoch?
    It will be a signed long long int, not unsigned. As others mentioned, it's somewhat arbitrary which starting point and which resolution one uses. Seconds since the 1970 epoch have the big advantage of being compatible with POSIX, and it's what 64-bit systems already do.
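    A minimal C sketch of the limit being discussed: the last second a signed 32-bit seconds counter can represent falls on 19 January 2038, whereas the signed 64-bit counter jabl describes is good for roughly 292 billion years in either direction.

    ```c
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* INT32_MAX is the last second a signed 32-bit time_t can hold. */
        time_t last = (time_t)INT32_MAX;   /* 2147483647 */
        struct tm *tm = gmtime(&last);
        printf("32-bit time_t rolls over after: %d-%02d-%02d %02d:%02d:%02d UTC\n",
               tm->tm_year + 1900, tm->tm_mon + 1, tm->tm_mday,
               tm->tm_hour, tm->tm_min, tm->tm_sec);
        /* -> 2038-01-19 03:14:07 UTC; one second later, a 32-bit
         * counter wraps negative, back to December 1901. */
        return 0;
    }
    ```
    
    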



  • jabl
    replied
    Originally posted by Mavman View Post
    So, let me get this straight:

    IF we're all running 64 bit machines, there is no problem?
    On 64-bit Linux, time_t has always been 64-bit, so the kernel interfaces are OK (I guess, unless there are some ioctls or the like that still use a 32-bit timestamp?). There are still things like file formats, filesystem on-disk formats, network protocols, etc. that store time in a y2038-problematic fashion, regardless of the bitness of the kernel.
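    A sketch of the on-disk-format problem jabl points out, using a hypothetical `struct disk_record` (not any real format): if a format fixes its timestamp field at 32 bits, a post-2038 value gets truncated on the way in, no matter how wide the kernel's time_t is.

    ```c
    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical on-disk record: many legacy formats fix the timestamp
     * field at 32 bits by specification, independent of the kernel. */
    struct disk_record {
        int32_t mtime;   /* seconds since the epoch, 32 bits by the format */
    };

    int main(void) {
        int64_t t = 2147483648LL;      /* one second past the 32-bit maximum */
        struct disk_record r;
        /* Narrowing an out-of-range value to int32_t is implementation-
         * defined; on typical two's-complement machines it wraps negative,
         * i.e. back to December 1901. */
        r.mtime = (int32_t)t;
        printf("stored timestamp: %d\n", r.mtime);
        return 0;
    }
    ```
    
    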



  • Mavman
    replied
    So, let me get this straight:

    IF we're all running 64 bit machines, there is no problem?



  • zboszor
    replied
    Originally posted by AsuMagic View Post
    Don't worry, kernel devs: By that time, Google will have enslaved humanity and we'll all be running Fuchsia.
    I don't want to be a pimp exploiting a demented young lady from Mervyn Peake's Gormenghast...



  • Delgarde
    replied
    Originally posted by xinthose View Post
    Isn't there a better way to tell time than using a 1970 epoch?
    Honestly, not really. Time measurement is completely arbitrary anyway, so if you're going to have a "zero" point, 1 Jan 1970 is as good a choice as any. And once you've established that, it's just a matter of deciding what you're counting... whether it's days, seconds, nanoseconds, whatever.

    The real headaches in telling time are around converting that count into traditional "human-friendly" measures... accounting for such wonderful human inventions as leap years, timezones, and daylight saving time.
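    The count-to-calendar conversion described above is exactly what the standard C library's gmtime()/strftime() pair does; a minimal sketch, converting the epoch count 0 itself:

    ```c
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t zero = 0;   /* the epoch itself: 0 seconds since 1970 */
        char buf[64];
        /* gmtime() does the calendar arithmetic (leap years, month
         * lengths); localtime() would additionally apply timezone and
         * DST rules from the environment. */
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&zero));
        printf("%s\n", buf);   /* 1970-01-01 00:00:00 UTC */
        return 0;
    }
    ```
    
    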



  • Zan Lynx
    replied
    Originally posted by xinthose View Post
    So instead of `int` it will now be `unsigned long long int`? Isn't there a better way to tell time than using a 1970 epoch?
    Sure. A 128-bit count of Planck times (roughly 10⁻⁴⁴ seconds each) since the Big Bang. Perfection!

    But wait, time is only meaningful within a single reference frame.

    So, no, time measurement pretty much sucks everywhere.

