In any “discussion” concerning the “merits” of degrees Celsius versus degrees Fahrenheit, there’s an even chance someone will joke that we should all be using kelvins instead. Well, I’m that guy, but without joking. Two principal reasons: it makes more sense, and nothing useful is lost.

## more sense

There’s a lot of talk over how Celsius’s basis in water is more useful, how negative means freezing, how Fahrenheit’s zero is very roughly the coldest weather that occurs. But both zeroes are equally arbitrary. Anders Celsius used water; Daniel Fahrenheit used a mixture of ice, water and ammonium chloride for whatever reason.

But neither of these arbitrary points is at all close to zero temperature. The thermodynamic temperature of an object continues to fall below these points, with no change in behaviour, until it approaches an absolute limit.

There is a lower limit for temperature, and the Kelvin scale uses it. And that’s it, really. Imagine if rulers had their zero point some particular distance along them, such that the lower end of the scale were always some specific negative value. Why? Why would you use that?

An absolute scale is the only sensible choice for dealing with temperatures. What’s the efficiency of a Carnot engine operating between 25 ℃ and 925 ℃? Beats me. What’s the efficiency of a Carnot engine operating between 300 K and 1200 K? Well that’s just 75%.
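That Carnot arithmetic can be sketched in a few lines of Python; `carnot_efficiency` is just an illustrative helper name, not a standard library function.

```python
def carnot_efficiency(t_cold_k: float, t_hot_k: float) -> float:
    """Maximum efficiency of a heat engine operating between two
    absolute temperatures (both in kelvins)."""
    return 1 - t_cold_k / t_hot_k

# With kelvins, the answer drops straight out:
print(carnot_efficiency(300, 1200))  # 0.75

# With Celsius you must first convert by adding 273.15:
print(carnot_efficiency(25 + 273.15, 925 + 273.15))  # ≈ 0.751
```

The formula only works with absolute temperatures; feed it Celsius values directly and you get nonsense, which is rather the point.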

Most people don’t need to determine these sorts of things. Hence, the second part: the relative scales have no useful merit of their own.

## nothing is lost

Discussion of temperature scales tends to focus on weather, because that’s about all most people use them for. Fortunately, the Kelvin scale has some decent applicability here too: 300 K (26.85 ℃) serves as a nice reference point for temperate weather that verges on warm. Anything above 300 K is starting to get hot, and cooler than that is progressively fine, chilly, then freezing at 273 K.

In any other situation, the magnitude of the numbers was arbitrary anyway. In one world, you pre-heat your oven to 220 ℃. In another, you pre-heat to 490 K. (In the third world, you use 430 ℉.) Nothing is lost with the Kelvin scale. And now, if you want to know the pressure of an ideal gas filling that oven, or how efficiently it could drive a heat engine, there are no awkward conversions involved. Handy, huh…
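To illustrate the “no awkward conversions” claim, here is a minimal sketch of the ideal gas law, pV = nRT, where T must be an absolute temperature. The oven volume and amount of gas below are made-up illustrative numbers, not anything from the text.

```python
R = 8.314  # molar gas constant, J/(mol·K)

def pressure_pa(n_mol: float, t_kelvin: float, volume_m3: float) -> float:
    """Pressure of an ideal gas, from pV = nRT. Temperature in kelvins."""
    return n_mol * R * t_kelvin / volume_m3

# About 1.5 mol of air in a 60-litre oven at 490 K comes out at
# roughly atmospheric pressure:
print(pressure_pa(1.5, 490, 0.06))  # ≈ 101846 Pa
```

Set the oven dial in kelvins and the value goes straight into the formula; set it in Celsius or Fahrenheit and you must convert to kelvins first anyway.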

Another benefit presents itself: most people don’t have the degree symbol on their keyboard. They certainly don’t have a button for “℃” (that’s one character). But anyone can type a space followed by the letter “K”.

## note on terminology

Some people (a fair few, actually) insist that such temperatures are spoken as “300 Kelvin”, and will complain if you say “kelvins”. I don’t know why. There’s no style guide advocating that.

The kelvin is an SI unit, like the joule and the metre. That means it forms plurals just fine, and the full name of the unit is not capitalised.

The whole point is that it is an absolute scale, so you really can speak of temperatures as a proper quantity, rather than a relative difference from some arbitrary reference point.