I was recently reading Tracy Kidder’s excellent book The Soul of a New Machine.
The author pointed out what a big deal the transition to 32-bit computing was.
However, in the last 20 years I don’t really remember a big fuss being made over most computers moving to 64-bit as the de facto standard. Why is this?
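(For anyone wondering what the jump actually changes in practice, here is a minimal C sketch of my own, not something from the book or the thread: it just prints the pointer width and the largest pointer-sized value on whatever machine compiles it. A 32-bit build reports 32 bits and a ~4 GiB address space; a 64-bit build reports 64 bits and a vastly larger one.)

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Pointer width is the practical difference most users notice:
       32-bit code can address at most 2^32 bytes (4 GiB),
       while 64-bit code can address far more. */
    printf("pointer size: %zu bits\n", sizeof(void *) * 8);
    printf("max pointer value: %llu\n",
           (unsigned long long)UINTPTR_MAX);
    return 0;
}
```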
The point is that seldom-used instructions are microcoded anyway, so they take up essentially zero space on the CPU.
I honestly don’t know what you are talking about. CPUs aren’t storage devices.