Entropy is a Number

Ludwig Boltzmann

Several years ago, in 2003 to be exact, I came across the following interesting piece of prose:

The Second Law of Thermodynamics states that “in all energy exchanges, if no energy enters or leaves the system, the potential energy of the state will always be less than that of the initial state.” This is also commonly referred to as entropy. A watchspring-driven watch will run until the potential energy in the spring is converted, and not again until energy is reapplied to the spring to rewind it. A car that has run out of gas will not run again until you walk 10 miles to a gas station and refuel the car. Once the potential energy locked in carbohydrates is converted into kinetic energy (energy in use or motion), the organism will get no more until energy is input again. In the process of energy transfer, some energy will dissipate as heat. Entropy is a measure of disorder: cells are NOT disordered and so have low entropy. The flow of energy maintains order and life. Entropy wins when organisms cease to take in energy and die.

This is actually still taught at a college; check it for yourself at gened.emc.maricopa.edu. I did not dare to look at any of the other chapters of the book used for their biology classes. Not a single sentence in this short paragraph is correct, and in some of them you can’t even begin to count the errors. If you go to the chapter and read what the writer believes about potential and kinetic energy, it is a small miracle that he or she was not given an enormous amount of kinetic energy in the form of motion directed away from that institute of education.

Entropy is not an energy; it is a number, a dimensionless quantity. It may not seem that way: if you look up values for the entropy, they are often given in units of J/K. But both 1 J and 1 K are quantities of energy, albeit on different scales. If we had not been so stupid in the past as to think of temperature as lines on a thermometer, we would never have needed Boltzmann’s constant. For those who know statistical mechanics this is obvious: the temperature is a measure of the average kinetic energy of a particle with mass \(m\):

\[\frac{1}{2}m\left<v^2\right> = \frac{3}{2}k_BT\]

where \(k_B = 1.3806503\times10^{-23}\) J/K. This constant therefore merely connects two different scales for measuring energy. For that reason alone it is rather ridiculous to equate potential energy with entropy: the two have nothing to do with each other; they do not even have the same dimension. It is truly comparing apples and pomegranates. The two forms of energy mentioned, kinetic and potential, can be exchanged indefinitely without any involvement of entropy.
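The relation above is easy to put into numbers. The following short Python sketch (my own illustration, not part of the original argument) uses Boltzmann’s constant purely as a conversion factor between kelvin and joules, and derives the root-mean-square speed of a nitrogen molecule at room temperature from \(\frac{1}{2}m\left<v^2\right> = \frac{3}{2}k_BT\):

```python
import math

# Boltzmann's constant: nothing more than the conversion factor
# between the kelvin scale and the joule scale.
k_B = 1.380649e-23  # J/K

def mean_kinetic_energy(T):
    """Average translational kinetic energy (J) of one particle at temperature T (K)."""
    return 1.5 * k_B * T

def v_rms(T, m):
    """Root-mean-square speed (m/s) from (1/2) m <v^2> = (3/2) k_B T."""
    return math.sqrt(3.0 * k_B * T / m)

# Example: a nitrogen molecule (mass about 28 u) at room temperature.
m_N2 = 28 * 1.66053906660e-27  # kg
print(mean_kinetic_energy(300.0))  # ~6.2e-21 J
print(v_rms(300.0, m_N2))          # ~517 m/s
```

Expressing the temperature directly in joules (about \(6\times10^{-21}\) J per particle at room temperature) would make \(k_B\) superfluous, which is exactly the point made above.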

From a pure thermodynamics point of view it is not so obvious that entropy is a number, at least not at first. Building on Carnot’s 1824 analysis of heat engines, it was found that for each system you can define a quantity we now call \(S\), whose changes \(\Delta S\) can be measured by measuring the amount of heat \(q\) entering the system at a fixed temperature \(T\) and dividing \(q\) by that temperature. Carnot only had a vague idea of what heat was (he may still have thought it was a conserved quantity) and the first law of thermodynamics was deduced much later, but we can use the first law to get the same insight:

\[dU = TdS + dw\]

This shows that \(U\) and \(TS\) must have the same dimension (energy, J), and since we already found that \(T\) is itself an energy, \(S\) must indeed be dimensionless.
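The Clausius recipe \(\Delta S = q/T\) can be made concrete with a small sketch (a hypothetical example of mine, assuming the standard latent heat of fusion of ice, about 334 J/g). It computes the entropy change in the conventional J/K units and then, with the temperature expressed in joules via \(k_B\), as a pure number:

```python
# Boltzmann's constant, used here only to convert kelvin to joules.
k_B = 1.380649e-23  # J/K

def delta_S(q, T):
    """Clausius entropy change (J/K) for heat q (J) entering at fixed T (K)."""
    return q / T

def delta_S_dimensionless(q, T):
    """The same quantity with T expressed in joules: a pure number."""
    return q / (k_B * T)

# Hypothetical example: melting 1 g of ice at 273.15 K.
q = 334.0  # J, assumed latent heat of fusion per gram
print(delta_S(q, 273.15))                # ~1.22 J/K
print(delta_S_dimensionless(q, 273.15))  # ~8.9e22, a pure number
```

The enormous dimensionless value hints at the statistical interpretation: it is of the order of the number of molecules involved, which is no coincidence.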
