https://xkcd.com/2867

Alt text:

It’s not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TCG) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn’t know and the Devil isn’t telling.]

      • kurwa@lemmy.world · 11 months ago

        I got to “The day before Saturday is always Friday” and I was like waaaa?

        • lad@programming.dev · 11 months ago

          I thought it was about when the Julian calendar was dropped in favour of the Gregorian, but that’s not it:

          Thursday 4 October 1582 was followed by Friday 15 October 1582

          • elvith@feddit.de · 11 months ago

            Also, some of the islands around the International Date Line have switched which side of the Date Line they’re on. So… they might have had a day twice or lost a whole day in the process. And maybe they didn’t change sides only once…

            E.g. see here https://youtu.be/cpKuBlvef6A

            • lad@programming.dev · 11 months ago

              Great video; the missing Friday is in it at timestamp 22:45:

              Thursday 29 December 2011 was followed by Saturday 31 December 2011 in Samoa

      • whoisearth@lemmy.ca · 11 months ago

        Epoch is your friend, or use UTC. At least that’s my layman reasoning. I have no trouble working with DateTime except when I don’t know the underlying assumptions applied in the source code.
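
        In Python terms, that reasoning looks roughly like this (a minimal sketch, not from the comment):

            import datetime
            import time

            # Keep stored/compared moments as epoch seconds or explicit UTC;
            # only convert to a local time zone at the display edge.
            now_epoch = time.time()
            now_utc = datetime.datetime.now(datetime.timezone.utc)
            print(int(now_epoch), now_utc.isoformat())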

    • randy@lemmy.ca · 11 months ago

      I really wish that list would include some explanations about why each line is a falsehood, and what’s actually true. Particularly the line:

      The software will never run on a space ship that is orbiting a black hole.

      If the author has proof that some software will run on a space ship that is orbiting a black hole, I’d be really interested in seeing it.

      • nybble41@programming.dev · 11 months ago

        Technically isn’t the Earth itself a sort of space ship which is orbiting (…a star which is orbiting…) the black hole at the center of the Milky Way galaxy? Not really close enough for time dilation to be a factor, but still.

      • elvith@feddit.de · 11 months ago

        All links to the original article are dead, and archive.org doesn’t have a capture either. I guess the argument is along the lines of “it might not be relevant when you’re scripting away some tasks for your small personal projects, but when you’re working on a widely used library or tool, one day it might end up on a space vessel to explore whatever.”

        E.g. my personal backup script? Unlikely. The Linux kernel? Somewhat plausible.

      • Hamartiogonic@sopuli.xyz · 11 months ago

        It’s a programmer thing. As you’re typing the code, you may suddenly realize that the program needs to assume certain things to work properly. You could assume that time runs at a normal rate, as opposed to something completely wild when traveling close to the speed of light or orbiting a black hole.

        In order to keep the already way too messy code reasonably simple, you decide that the program assumes you’re on Earth. You leave a comment in the relevant part of the code saying that this part shouldn’t break as long as you’re not doing anything too extreme.
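
        A tongue-in-cheek sketch of such an assumption comment (hypothetical Python, not from the thread):

            def elapsed_seconds(start: float, end: float) -> float:
                # ASSUMPTION: both timestamps were taken on Earth, in the same
                # frame of reference. Relativistic effects (near-lightspeed
                # travel, gravitational time dilation around a black hole) are
                # deliberately ignored. Holds as long as nobody does anything
                # too extreme.
                return end - start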

      • Aceticon@lemmy.world · 11 months ago

        Well, in a very strict sense one can’t really say “never” (unless you can see into the future), but it’s probably safe to go along with “it’s highly unlikely, and if it does happen I’ll fix it or will be long dead so won’t care”.

    • lad@programming.dev · 11 months ago

      This one is good (or evil, depends on how you see it):

      Human-readable dates can be specified in universally understood formats such as 05/07/11.

      • elvith@feddit.de · 11 months ago

        That one’s really good.

        Which one is it?

        • July 5th 2011
        • May 7th 2011
        • July 11th 2005
        • November 7th 2005

        And is it 2011/2005 or rather 1911/1905, 1811/1805,…?
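
        For the curious, a quick Python check of all four readings (a sketch; the century question stays open, since %y just assumes one):

            from datetime import datetime

            raw = "05/07/11"
            candidates = {
                "%d/%m/%y": "day/month/year",
                "%m/%d/%y": "month/day/year",
                "%y/%m/%d": "year/month/day",
                "%y/%d/%m": "year/day/month",
            }
            for fmt, name in candidates.items():
                # strptime happily accepts the same string under every format
                print(f"{name}: {datetime.strptime(raw, fmt).date()}")
            # day/month/year: 2011-07-05
            # month/day/year: 2011-05-07
            # year/month/day: 2005-07-11
            # year/day/month: 2005-11-07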

    • Kethal@lemmy.world · 11 months ago

      Does anyone know what is untrue about “Unix time is the number of seconds since Jan 1st 1970.”?

      • icydefiance@lemm.ee · 11 months ago

        When a leap second happens, Unix time decreases by one second at the end of it. See the section about leap seconds here: https://en.m.wikipedia.org/wiki/Unix_time

        As a side effect, this means some unix timestamps are ambiguous, because the timestamps at the beginning and the end of a leap second are the same.
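
        The collision is easy to demonstrate (Python sketch; calendar.timegm is pure seconds arithmetic with no leap-second table, which matches how Unix time is defined):

            import calendar

            # The leap second 2016-12-31 23:59:60 UTC and the following
            # 2017-01-01 00:00:00 UTC map to the same Unix timestamp.
            leap = calendar.timegm((2016, 12, 31, 23, 59, 60, 0, 0, 0))
            after = calendar.timegm((2017, 1, 1, 0, 0, 0, 0, 0, 0))
            print(leap, after)    # 1483228800 1483228800
            print(leap == after)  # True: two distinct UTC instants, one timestamp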

        • nybble41@programming.dev · 11 months ago

          It might be more accurate to say that Unix time is the number of days since Jan 1st, 1970, scaled by 24×60×60. Though it gets a bit odd around the actual leap second since they aren’t spread over the whole day. (In some ways that would be a more reasonable way to handle it; rather than repeating a second at midnight, just make all the seconds slightly longer that day.)
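
          That reading in code (a hypothetical helper, Python):

              import calendar
              import datetime

              def unix_from_utc(dt: datetime.datetime) -> int:
                  # "Days since Jan 1st 1970, scaled by 24*60*60", plus the
                  # seconds elapsed in the current day; leap seconds never
                  # enter the calculation.
                  days = (dt.date() - datetime.date(1970, 1, 1)).days
                  return days * 86400 + dt.hour * 3600 + dt.minute * 60 + dt.second

              dt = datetime.datetime(2024, 6, 1, 12, 0, 0)
              assert unix_from_utc(dt) == calendar.timegm(dt.timetuple())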

    • mindbleach · 11 months ago

      A time stamp of sufficient precision can safely be considered unique.

      This one broke networking on Windows 3.1 if people bought AMD. The default softmodem driver estimated clock speed using two timestamps separated by a busy loop of some thousand additions: Speed = 1000 / (T2 - T1). If your CPU was too fast, you’d get a division-by-zero error.

      The surprise was that it did not happen on Intel machines. Not even if they were clocked faster. Which they often were, because “just go faster” was Intel’s central tactic for about twenty years. AMD remained competitive by focusing on design improvements… like reducing how many clock cycles addition took.
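
      The failure mode translates to any language; a minimal Python sketch of the same calibration pattern (not the actual driver code):

          import time

          def estimate_speed(iterations: int = 1000) -> float:
              # Time a fixed amount of busy work, then divide. On a fast
              # machine with a coarse clock, t2 - t1 can come out as exactly
              # zero, and the division blows up.
              t1 = time.time()
              x = 0
              for _ in range(iterations):
                  x += 1
              t2 = time.time()
              return iterations / (t2 - t1)  # ZeroDivisionError when the loop beats the clock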