“What makes the world go round are not sources of energy, but sources of low entropy. Without low entropy, energy would dilute into uniform heat and the world would go to sleep in a state of thermal equilibrium – there would no longer be any distinction between past and future and nothing would happen” – Carlo Rovelli

Introduction

A hot cup of coffee left in the snow will always cool down. The First Law of Thermodynamics states that energy cannot be created or destroyed, only converted from one form to another. However, simple observation reveals that this law alone does not describe the complete situation; otherwise, we would sometimes see the coffee grow hotter and the snow grow colder, with energy still conserved. The missing piece is the Second Law of Thermodynamics and its central quantity, entropy, a term that has been assigned various meanings in a variety of contexts, from steam engines to information theory. The Second Law seems so intuitive that we take it for granted, despite its profound implications for systems as large as the universe itself.

Beginnings

The history of the Second Law of Thermodynamics begins in the early 19th century, before even the First Law of Thermodynamics had been formulated. As an engineer, Sadi Carnot observed that the heat engines of his day were remarkably inefficient, yet full of promise: ‘If, some day, the steam-engine shall be so perfected that it can be set up and supplied with fuel at small cost, it will combine all desirable qualities, and will afford to the industrial arts a range the extent of which can scarcely be predicted.’ Consequently, Carnot set about investigating whether these engines’ efficiency could be improved. Steam engines produce mechanical power by using heat to manipulate the temperatures and, therefore, the pressures of gases, much as modern-day steam turbines do; by transferring heat, work can be done. In his book, Reflections on the Motive Power of Fire, Carnot established what is now called the Carnot cycle. It described the physical cycle of compression and expansion of the gases that ultimately drove the engine, and allowed the useful work done to be calculated. It became clear that one ratio was central to the process:

Q1/T1 = Q2/T2

Here Q is the heat exchanged with a reservoir and T is that reservoir’s temperature; subscript 1 refers to the hot reservoir within the engine and subscript 2 to the cold.

Since Q describes thermal energy, the work done by the engine is equal to the difference between the heat drawn from the hot reservoir and the heat rejected to the cold one, by the conservation of energy, that is, the First Law of Thermodynamics. While invoking the First Law is certainly the simpler explanation, the same result can be demonstrated without it, as in the subtle reasoning employed by Carnot himself.

W = Q1 – Q2

By substitution, using Q2 = Q1T2/T1 from the ratio above and dividing through by Q1:

W/Q1 = (T1 – T2)/T1

Recall that:

Efficiency = Useful Work Done ÷ Total Input Energy

Notice that W/Q1 describes exactly this, and therefore:

Efficiency = (T1 – T2)/T1

At this point, it should be said that these temperatures are measured in kelvins. For the efficiency to reach 100%, T2, the temperature of the cold reservoir, would have to be 0 K, an impossibility. There is therefore always some energy that is inherently incapable of doing work.
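To make the formula concrete, here is a minimal Python sketch of the Carnot efficiency; the function name and the reservoir temperatures are illustrative choices, not figures from Carnot’s own work.

def carnot_efficiency(t_hot, t_cold):
    # Maximum possible efficiency of a heat engine running between
    # a hot and a cold reservoir, with temperatures in kelvins.
    if t_cold <= 0 or t_hot <= t_cold:
        raise ValueError("Require T1 > T2 > 0 K")
    return (t_hot - t_cold) / t_hot

# Example: a boiler at 450 K exhausting to surroundings at 300 K.
print(carnot_efficiency(450.0, 300.0))  # 0.333..., i.e. at most about 33% efficient

Even this idealised, reversible engine wastes two-thirds of the input heat; real engines do worse.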

Birth

The story of entropy resumes some 40 years after the work of Sadi Carnot with Rudolf Clausius, who investigated this inherent incapability of doing work. The Carnot cycle dealt with reversible processes, in which the ratio of heat to temperature is the same at the various stages of the process. Clausius built upon the work of Carnot, defining the change in entropy of a body as the heat it absorbs reversibly divided by its temperature, ΔS = Q/T. Entropy so defined conveys a measure of how much energy is unavailable to do work, and Clausius stated that the total change in entropy for a reversible process in a closed system is zero.

Real processes, however, are never perfectly reversible. As Feynman puts it, if you drop a cup and it breaks, you can wait a long time for the pieces to come back together, but they never will. It is this idea that led Clausius to state that ‘heat will not pass spontaneously from a colder to a hotter body’. If you did work on the system, you could cause heat to flow from cold to hot, as a refrigerator does, but the system would then no longer be closed. This leads us to the conclusion that the entropy of a closed system never decreases; only in a perfectly reversible process is entropy conserved.
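To see why the spontaneous direction of heat flow raises entropy, here is a minimal Python sketch, assuming two large reservoirs whose temperatures barely change while a small quantity of heat Q passes between them; all the numbers are illustrative, not taken from the text.

T_hot = 370.0    # temperature of the hot body, in kelvins (illustrative)
T_cold = 270.0   # temperature of the cold body, in kelvins (illustrative)
Q = 100.0        # heat transferred from hot to cold, in joules

# Clausius's ratio of heat to temperature for each body:
dS_hot = -Q / T_hot     # the hot body loses heat, so its entropy falls
dS_cold = Q / T_cold    # the cold body gains heat, so its entropy rises

print(dS_hot + dS_cold)  # about +0.10 J/K: total entropy has increased

Reversing the flow would simply flip the signs and give a negative total, which is exactly what the Second Law forbids for a closed system.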

Disorder

Despite its birthplace in the mechanics of heat, entropy can equally describe the disorder of a system. Entropy reflects a system’s inability to do work, which is ultimately linked to whether or not the system is ordered.

This can be pictured as a simple turbine separating two compartments of air. When the hot air on the left is allowed to mix with the cold air on the right, the flow of air can turn the turbine in the middle and output useful work. Before the two compartments are allowed to mix, the system is ordered: particles with higher thermal energy on the left, lower on the right. If this is a closed system, it eventually reaches thermal equilibrium and no more work can be done. Recall that entropy tells us about the ability of a system to do work: the fully mixed system can no longer do any work, and therefore has maximal entropy. In fact, an ordered system is potential energy in disguise. A solid, for example, with its particles held in an ordered arrangement by bonds, stores potential energy; when that order breaks down, energy can be released and the system becomes disordered. Once there is only disorder, there is no potential energy left and therefore no ability to do work on anything.
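A rough numerical version of this mixing picture, as a Python sketch: the two parcels of air are modelled as simple bodies with the same constant heat capacity, an assumption made here purely for illustration.

import math

C = 1000.0       # heat capacity of each parcel, in J/K (illustrative)
T_left = 350.0   # hot parcel, in kelvins
T_right = 290.0  # cold parcel, in kelvins

# With equal heat capacities, thermal equilibrium sits at the average temperature.
T_final = (T_left + T_right) / 2

# Entropy change of a body with constant heat capacity: C * ln(T_final / T_initial).
dS_left = C * math.log(T_final / T_left)    # negative: the hot parcel cools
dS_right = C * math.log(T_final / T_right)  # positive: the cold parcel warms

print(T_final)               # 320.0 K
print(dS_left + dS_right)    # about +8.8 J/K: the mixed state has higher entropy

The ordered arrangement, hot on one side and cold on the other, is the low-entropy state from which work could still have been extracted; once mixed, that opportunity is gone.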

Heat Death

It is important to recall the conclusion that Clausius came to: the entropy of a closed system never decreases. In order to decrease its entropy, one would have to do work on that system, meaning it would no longer be closed. A room, for example, will naturally become disordered; the only way to make it ordered again is to do work on it by tidying up. To our knowledge, the universe is a closed system in which time flows in only one direction. This leads us to the conclusion that the entropy of the universe will eventually be maximised and the universe will reach thermal equilibrium: nothing interesting will ever happen again. This is the so-called ‘heat death of the universe’. Considering the humble beginnings in heat engines, the conclusion that the universe will ‘die’ is a profound one.

Bibliography

Rovelli, C., Segre, E., & Carnell, S. (2019). The Order of Time. UK: Allen Lane.

Feynman, R. P., Leighton, R. B., & Sands, M. L. (2011). Volume 1: The Laws of Thermodynamics. In The Feynman lectures on physics. San Francisco, CA: Addison-Wesley.

Feynman, R. (2018, July 11). Richard Feynman’s Lecture: Entropy (Part 01). Retrieved from https://www.youtube.com/watch?v=ROrovyJXSnM

Epicurus Of Albion. (2017, February 24). What is Entropy? Retrieved from https://www.thestandupphilosophers.co.uk/what-is-entropy/

Carnot, S. (n.d.). Reflections on the Motive Power of Fire. Retrieved from https://www.pitt.edu/~jdnorton/teaching/2559_Therm_Stat_Mech/docs/Carnot%20Reflections%201897%20facsimile.pdf

Libretexts. (2020, August 25). 7.2: Heat. Retrieved February 01, 2021, from https://chem.libretexts.org/Bookshelves/General_Chemistry/Map%3A_General_Chemistry_(Petrucci_et_al.)/07%3A_Thermochemistry/7.2%3A_Heat

OpenStax. (n.d.). Entropy and the Second Law of Thermodynamics: Disorder and the Unavailability of Energy. Retrieved February 01, 2021, from https://courses.lumenlearning.com/physics/chapter/15-6-entropy-and-the-second-law-of-thermodynamics-disorder-and-the-unavailability-of-energy/


About the author

Louis Robson