That’s easy. The 2038 problem is fixed by using 64-bit processors running 64-bit applications. Just about everything built in the last 15 years already has the fix.
With that fix, the problem doesn’t come up again for roughly 292 billion years.
You don’t need 64-bit programs or CPUs to fix the 2038 problem. You just need a 64-bit time_t. That works fine on 32-bit CPUs, or even 8-bit microcontrollers.
True, that should have occurred to me. That’s what I get for not touching a compiler since the Christmas holidays started
And not using 32-bit integers to calculate time. Which is still a thing in many many many codebases written in C or C++…
32-bit embedded processors use a lot of 32-bit time, though I’m not sure whether the date/time libraries in their SDKs have been updated to use 64-bit time.
The Linux kernel only updated to 64-bit time quite recently (around kernel 5.6, in 2020). In 2038 I can guarantee somebody in a very serious business will still be running an ancient RHEL and will have issues.
With people constantly turning back the system clock to use their trial software forever, something that already causes problems with GitLab history, I feel like not many will blink at that.
Not really processor-based. The timestamp needs to be unsigned (not advised, but good for dates up to 2106, though it cannot express dates before 1970) or a long long. I think it’s a bad idea, but I bet some people too lazy to change their database schema will just do this internally.
The time_t type on Linux is now 64-bit regardless, so applications that use it just need recompiling and will be fine. Of course it’s a problem if the database is storing 32-bit signed integers, but the column type can be changed too, and that really isn’t hard.
As for the Y10K problem: I think it will almost entirely be formatting problems. In the 80s and 90s storage was at a premium, databases were generally much simpler, and as such dates were very often stored as YYMMDD. There also wasn’t so much use of standard libraries. All of which meant that fixing the Y2K problem required quite some work, and in some cases there wasn’t time for a proper solution. Where I was working, there was a two-step solution.
One team made the interim change: everywhere a date was read, anything < 30 (it wasn’t actually 30, it was another number, but I forget which) was evaluated as 2000 + number, and anything else as 1900 + number. This meant the existing product would be fine for another 30 years or so.
The other team was writing the new version of the software, which used MS SQL Server as a back-end with proper datetime-typed columns, and which worked correctly with years both before and after 2000.
I suspect this approach wasn’t unusual, and most software now uses some form of epoch datatype, which should be fine for storing, reading, and writing dates beyond Y10K. But some hard-coded date format strings will need to be changed.
Source: I was there, 3000 years ago.
I don’t know how many people have used a plain int to store time, but I can guarantee you it’s more than you’d expect. And then there are the various interpreted languages that often depend on the build environment; PHP’s int, for example, is architecture-dependent, so it tops out at 2147483647 on 32-bit builds.
And let me remind you how Gangnam Style broke YouTube’s view counter by passing 2147483647 views. Even if a 32-bit int might have been reasonable back then, it should at least have been unsigned.
Shit will go wild in 2038. Nothing apocalypse-level, but a lot of things will break. Much more than during Y2K, because there are simply many more computers now.
You’re right on every point. But, I’m not sure how that goes against what I said.
Most applications now use the epoch for date and time storage, so for the 2038 problem the work comes down to making sure they use time_t or 64-bit integer values (with matching storage), which is a much smaller change than was the case for Y2K. Since more people also use libraries for date and time handling, it’s likely much of this will be handled.
Most databases have datetime types which again are almost certainly already ready for 2038.
I just don’t think the scale is going to be close to the same.
Depends on your definition of scale, because in absolute numbers I think Y2K38 wins, even though it might be a lower percentage.
I think the main issue is not the services that are updated at least once a year, but those that run forgotten somewhere with a sticker “here be dragons” on the case.
Regardless of how many are affected, it’s gonna be fun for sure! Can’t wait for some public government and ad company screens to inevitably show certificate errors.
I think it’ll be a “we’ll see” situation. This was the main concern for Y2K too, and I don’t doubt there’s some stuff still around that was only partially patched for Y2K and is still using string dates.
But the vast majority of software now works with timestamps, and of course some things will need work. With Y2K, though, the vast majority of business software needed changing. In this case I think the vast majority will already be working correctly, and it’ll be the job of developers (probably in a panic less than a year before, as is the custom) to catch the few outliers. Yes, some will escape through the cracks, but that was also the case last time round.
There are plenty of smaller devices still running 32-bit ARM processors. Enough of them, in fact, that Ubuntu 24.04 was entirely recompiled to use 64-bit timestamps even on 32-bit platforms, since by the time it’s out of support 2038 will be less than 5 years away.
You mean regular PCs? Sure…
Less COTS stuff? Not necessarily.