Celsius uses an arbitrary reference point (freezing point of water). Kelvin uses the same sized units, but is referenced from absolute zero. While this seems just as arbitrary, it actually makes some scientific calculations a lot easier.
Basically, scientists have been working to slot the various base units together in a neat and orderly manner. Kelvin fits this far better than Celsius, and so became the baseline SI unit.
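A quick sketch of why the shared unit size matters: since a kelvin and a degree Celsius are the same size, converting between them is a pure offset of 273.15, with no rescaling. (The constant name and function names here are just illustrative.)

```python
# Celsius and Kelvin use the same unit size, so conversion is a pure offset.
ABSOLUTE_ZERO_C = -273.15  # absolute zero (0 K) expressed in Celsius

def c_to_k(celsius: float) -> float:
    """Convert Celsius to Kelvin by shifting the zero point."""
    return celsius - ABSOLUTE_ZERO_C

def k_to_c(kelvin: float) -> float:
    """Convert Kelvin back to Celsius with the same offset."""
    return kelvin + ABSOLUTE_ZERO_C

print(c_to_k(0.0))    # freezing point of water: 273.15 K
print(c_to_k(100.0))  # boiling point of water: 373.15 K
```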
Some people seem to have this misconception that “0F cold, 100F hot” is somehow an innate or intuitive concept for everyone. It’s not, brother, you just happen to be used to it. I have absolutely no idea whether I should wear a coat at 62F, or at any other F temperature for that matter.
At least 0C and 100C have very practical references that anyone can recognise, but what the hell even are 0F and 100F?
Also, not sure why you’re trying to shoehorn 0-100F to 0-100C.
When talking about weather, it’s going to be in a range like 0C (cold) / 20C (nice) / 40C (hot), which is equally arbitrary but, depending on where you live, probably more useful than 0F/50F/100F anyway: my neck of the woods drops to 0C in a harsh winter and hits 40C at the peak of summer.
And do you use F for stuff like cooking? What purpose is 0F or 100F there?
How about stuff like chemistry or physics? I remember formulas in C or K, occasionally having to add 273.15. Is F used, or do you just use K/C and convert at the start?
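For reference, the standard conversions between the three scales (note the Celsius/Kelvin offset is 273.15, not 273.5). This is just an illustrative sketch; the function names are made up:

```python
def f_to_c(f: float) -> float:
    # Fahrenheit to Celsius: remove the 32 °F offset, then rescale by 5/9.
    return (f - 32.0) * 5.0 / 9.0

def c_to_f(c: float) -> float:
    # Inverse: rescale by 9/5, then add the 32 °F offset.
    return c * 9.0 / 5.0 + 32.0

def c_to_k(c: float) -> float:
    # Celsius to Kelvin is a pure offset: 0 °C = 273.15 K.
    return c + 273.15

print(f_to_c(62.0))  # roughly 16.7 °C, light-jacket weather
```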
Yep! Celsius does make sense for everyday life.
I fully agree with that. It’s also quite easy to shift between the two. I just had the difference drilled into me way too much at university.