Alt text:
It’s not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TCG) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn’t know and the Devil isn’t telling.]
This one broke networking on Windows 3.1 for people who bought AMD. The default softmodem driver estimated clock speed from two timestamps separated by a busy loop of a few thousand additions: Speed = 1000 / (T2 - T1). If your CPU was fast enough to finish the whole loop within a single timer tick, T2 equaled T1 and you got a division-by-zero error.
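
Roughly what that calibration would look like in C; a minimal sketch of the technique as described above, not the actual driver code, with a hypothetical read_timer_ms helper standing in for whatever millisecond timer the driver read. On any modern CPU the loop finishes within one timer tick, so running this reproduces the fault.

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical stand-in for the driver's millisecond-resolution timer. */
    static long read_timer_ms(void)
    {
        return (long)(clock() * 1000 / CLOCKS_PER_SEC);
    }

    int main(void)
    {
        volatile long counter = 0;  /* volatile so the loop isn't optimized away */
        long t1, t2, speed;

        t1 = read_timer_ms();
        for (int i = 0; i < 4000; i++)  /* "a busy loop of a few thousand additions" */
            counter += 1;
        t2 = read_timer_ms();

        /* On a fast CPU the loop completes within a single timer tick,
           so t2 - t1 == 0 and the integer division below faults. */
        speed = 1000 / (t2 - t1);

        printf("estimated speed: %ld\n", speed);
        return 0;
    }

The fix for this class of bug is to spin until the timer has advanced by some minimum number of ticks before dividing, so the denominator can never be zero no matter how fast the CPU gets.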
The surprise was that it did not happen on Intel machines, not even when they were clocked faster. And they often were, because “just go faster” was Intel’s central tactic for about twenty years. AMD remained competitive by focusing on design improvements… like reducing how many clock cycles an addition took.