Alt text:
It’s not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TGC) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn’t know and the Devil isn’t telling.]
Obligatory “Falsehoods programmers believe about time”
Thank you, but I gave up halfway through the list.
I got to “The day before Saturday is always Friday” and I was like waaaa?
I thought it was about when the Julian calendar was dropped in favour of the Gregorian one, but that’s not it:
Also, some of the islands around the International Date Line have switched which side of the Date Line they’re on. So they might have had a day twice, or lost a whole day in the process. And maybe they didn’t change sides only once…
E.g. see here https://youtu.be/cpKuBlvef6A
Great video you linked; the missing Friday is in it at timestamp 22:45.
In Samoa, Thursday the 29th of December 2011 was followed by Saturday the 31st of December 2011.
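You can see that switchover in the tz database. A small sketch, assuming Python 3.9+ with zoneinfo and tz data available:

```python
# Sketch: step across Samoa's 2011 date-line switch in Pacific/Apia.
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

apia = ZoneInfo("Pacific/Apia")
utc = ZoneInfo("UTC")

before = datetime(2011, 12, 29, 23, 59, tzinfo=apia)  # late Thursday, local time
# Advance two real minutes (via UTC) and convert back to local time.
after = (before.astimezone(utc) + timedelta(minutes=2)).astimezone(apia)

print(before.strftime("%A %Y-%m-%d %H:%M"))  # Thursday 2011-12-29 23:59
print(after.strftime("%A %Y-%m-%d %H:%M"))   # Saturday 2011-12-31 00:01
# Friday the 30th never happened in Pacific/Apia.
```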
The epoch is your friend, or use UTC. At least that’s my layman reasoning. I have no trouble working with DateTime except when I don’t know the underlying assumptions applied in the source code.
I really wish that list would include some explanations about why each line is a falsehood, and what’s actually true. Particularly the line:
If the author has proof that some software will run on a space ship that is orbiting a black hole, I’d be really interested in seeing it.
Technically isn’t the Earth itself a sort of space ship which is orbiting (…a star which is orbiting…) the black hole at the center of the Milky Way galaxy? Not really close enough for time dilation to be a factor, but still.
All links to the original article are dead, and archive.org doesn’t have a capture either. I guess the argument is along the lines of “it might not be relevant when you’re scripting away some tasks for your small personal projects, but when you’re working on a widely used library or tool, one day it might end up on a space vessel to explore whatever.”
E.g. my personal backup script? Unlikely. The Linux kernel? Somewhat plausible.
It’s a programmer thing. As you’re typing the code, you may suddenly realize that the program needs to assume certain things to work properly. You could assume that time runs at a normal rate, as opposed to something completely wild when travelling close to the speed of light or orbiting a black hole.
In order to keep the already way too messy code reasonably simple, you decide that the program assumes you’re on Earth. You leave a comment in the relevant part of the code saying that this part shouldn’t break as long as you’re not doing anything too extreme.
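Something like this, say (the function and the comment are made up for illustration):

```python
import time

def elapsed_seconds(t_start: float, t_end: float) -> float:
    # ASSUMPTION: both timestamps come from the same monotonic clock on
    # Earth, at rest relative to each other. Relativistic effects are
    # deliberately ignored -- this shouldn't break as long as you're not
    # orbiting a black hole or travelling near the speed of light.
    return t_end - t_start

t1 = time.monotonic()
# ... do some work ...
t2 = time.monotonic()
print(elapsed_seconds(t1, t2))
```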
Well, in a very strict sense one can’t really say “never” (unless you can see into the future), but it’s probably safe to go along with “it’s highly unlikely, and if it does happen, I’ll fix it or will be long dead so won’t care”.
This one is good (or evil, depends on how you see it):
That one’s really good.
Which one is it?
And is it 2011/2005 or rather 1911/1905, 1811/1805,…?
Does anyone know what is untrue about “Unix time is the number of seconds since Jan 1st 1970.”?
When a leap second happens, Unix time decreases by one second: the count runs through the leap second and then jumps back at the end of it. See the section about leap seconds here: https://en.m.wikipedia.org/wiki/Unix_time
As a side effect, this means some unix timestamps are ambiguous, because the timestamps at the beginning and the end of a leap second are the same.
It might be more accurate to say that Unix time is the number of days since Jan 1st, 1970, scaled by 24×60×60. Though it gets a bit odd around the actual leap second since they aren’t spread over the whole day. (In some ways that would be a more reasonable way to handle it; rather than repeating a second at midnight, just make all the seconds slightly longer that day.)
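A toy sketch of that “days scaled by 86400” view (not a real time library; the function name is made up), showing the ambiguity mentioned above:

```python
from datetime import date

EPOCH = date(1970, 1, 1)

def unix_time(d: date, seconds_into_day: int) -> int:
    """Unix time pretends every day has exactly 86400 seconds."""
    days = (d - EPOCH).days
    return days * 86400 + seconds_into_day

# The leap second 2016-12-31T23:59:60 UTC is second number 86400 of its day,
# which collides with second 0 of the next day -- the same Unix timestamp.
print(unix_time(date(2016, 12, 31), 86400))  # 1483228800
print(unix_time(date(2017, 1, 1), 0))        # 1483228800
```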
This post made my head hurt
This one broke networking on Windows 3.1 if people bought AMD. The default softmodem driver estimated clock speed using two timestamps separated by a busy loop of a few thousand additions: Speed = 1000 / (T2 - T1). If your CPU was too fast, you’d get a division-by-zero error.
The surprise was that it did not happen on Intel machines. Not even if they were clocked faster. Which they often were, because “just go faster” was Intel’s central tactic for about twenty years. AMD remained competitive by focusing on design improvements… like reducing how many clock cycles addition took.
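A hypothetical reconstruction of that failure mode (in Python for readability; the original driver would have been C/assembly, and the ~55 ms tick is an assumption modelling the era’s coarse timer):

```python
# Sketch: on a coarse timer, a fast CPU finishes the busy loop before the
# next tick, so T2 == T1 and the division blows up.
import time

TICK_MS = 55  # assumed timer resolution, roughly the 18.2 Hz PC timer

def coarse_clock_ms() -> int:
    """A clock that only advances in whole ticks."""
    return (int(time.monotonic() * 1000) // TICK_MS) * TICK_MS

def estimate_speed() -> float:
    t1 = coarse_clock_ms()
    total = 0
    for i in range(1000):       # "a busy loop of a few thousand additions"
        total += i
    t2 = coarse_clock_ms()
    return 1000 / (t2 - t1)     # ZeroDivisionError when t2 == t1

print(estimate_speed())
```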