# Random Thoughts on Temperature and Intuition in Thermodynamics

Recently we felt a disturbance in the force: it has been demonstrated that the absolute temperature of a real system can be pushed to negative values.

The interesting underlying question is: What is temperature really? Temperature seems to be an intuitive everyday concept, yet the explanations of ‘negative temperatures’ prove that it is not.

Actually, atoms have not really been ‘chilled to negative temperatures’. I pick two explanations of this experiment that I found particularly helpful – and entertaining:

As Matt points out, the issue is simply that, formally, temperature is a relationship between energy and entropy, and you can do some weird things to entropy and energy and get the formal definition of temperature to come out negative.

Aatish manages to convey the fact that temperature is inversely proportional to the slope of the entropy vs. energy curve using compelling analogies from economics. The trick is to find meaningful economic terms that are related in a way similar to the obscure physical properties you want to explain. MinutePhysics did something similar in explaining fundamental forces (cannot resist this digression):
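The inverse-slope idea can be sketched numerically. Here is a minimal, hypothetical Python example (all numbers invented, natural units with k_B = 1): for a toy system of N two-level units, entropy first rises and then falls with energy, so 1/T = dS/dE changes sign at the top of the curve.

```python
import math

k_B = 1.0  # Boltzmann constant in natural units (an assumption of this sketch)
N = 1000   # number of two-level units (invented)
eps = 1.0  # energy spacing between the two levels (invented)

def entropy(n):
    """S = k_B ln C(N, n): entropy when n of the N units are excited."""
    return k_B * (math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1))

def temperature(n):
    """T from 1/T = dS/dE, via a central finite difference (dE = eps)."""
    dS_dE = (entropy(n + 1) - entropy(n - 1)) / (2 * eps)
    return 1.0 / dS_dE

print(temperature(100))  # fewer than half the units excited: T > 0
print(temperature(900))  # more than half excited (inverted): T < 0
```

Below half-filling the entropy curve slopes upward and T is positive; past half-filling the slope turns negative, and so does the formal temperature.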

I had once worked in laser physics, so Matt’s explanation involving a two-level system speaks to me. His explanation avoids touching on entropy, and thus avoids using the mysterious term entropy to explain mysterious temperature.

You can calculate the probabilities of population of these two states from temperature – or vice versa. If you manage to tweak the populations by some science-fiction-like method (creating non-equilibrium states), you can end up with a distribution that formally results in negative temperatures if you run the math backwards.
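Running the math backwards can be made concrete. In this hypothetical sketch (level spacing and all numbers invented), the Boltzmann factor gives the population ratio from T, and inverting it gives T from an observed ratio – a ratio above one (population inversion) forces T negative.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
dE = 1.0e-20         # hypothetical energy spacing of the two levels, J

def population_ratio(T):
    """Boltzmann factor: upper-level population relative to the lower level."""
    return math.exp(-dE / (k_B * T))

def temperature_from_ratio(r):
    """Run the math backwards: infer T from an observed population ratio."""
    return -dE / (k_B * math.log(r))

print(temperature_from_ratio(0.5))   # normal population (r < 1): T > 0
print(temperature_from_ratio(2.0))   # inverted population (r > 1): T < 0
```

In equilibrium the ratio is always below one, so T comes out positive; only a prepared non-equilibrium state can push the ratio above one and the formal temperature below zero.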

[In order to allow for tagging this post with Physics in a Nutshell I need to state that the nutshell part ends here.]

But how did ‘temperature’ ever become such an abstract concept?

From a very pragmatic perspective, focused on macroscopic, everyday phenomena, temperature is what we measure with thermometers, that is: something calculated from the change in volume of gases or liquids.

You do not need any explanation of what temperature or even entropy really is if you want to design efficient machines, such as turbines.

As a physics PhD working towards an MSc in energy engineering, I have found lectures in Engineering Thermodynamics eye-opening:

As a physicist I had been trained to focus on fundamental explanations: What is entropy really? How do we explain physical properties microscopically? That is: calculating statistical averages of the properties of zillions of gas molecules, or imagining an abstract ‘hyperspace’ whose number of dimensions is proportional to the number of particles. The system as such moves through this abstract space as time passes by.

In engineering thermodynamics the question What is entropy? was answered by: consider it some property that can be calculated (and used to evaluate machines and processes).

Temperature-entropy diagram for steam. The red line represents a process called the Rankine cycle: a turbine delivers mechanical energy while the temperature and pressure of the steam are decreased.

New terms in science have been introduced for fundamental conceptual reasons and/or because they came in handy in calculations. From my point of view, enthalpy belongs to the second class because it makes descriptions of gases and fluids flowing through apparatuses more straightforward.

Entropy is different, even though it, too, can be reduced to its practical aspects. Entropy was introduced in order to tame heat and irreversibility.

Richard Feynman stated (in Vol. I of his Physics Lectures, published 1963) that research in engineering contributed twice to the foundations of physics: the first time when Sadi Carnot formulated the Second Law of Thermodynamics (which can be stated in terms of an ever-increasing entropy), and the second time when Shannon founded information theory – using the term entropy in a new way. So musing about entropy and temperature – this is where hands-on engineering meets the secrets of the universe.

I would argue that temperature has never been that understandable and familiar:

Investigations of the behavior of ideal gases (fortunately air, even moist air, behaves nearly like an ideal gas) have revealed that there needs to be an absolute zero of temperature – the point where the volume of an ideal gas would approach zero.
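This historical extrapolation fits into a few lines of Python. The measurements below are made up (generated from the ideal gas law itself), but the procedure is the real one: fit a straight line to volume vs. Celsius temperature and ask where the volume would vanish.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept, stdlib only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

t_celsius = [0, 25, 50, 100]                          # hypothetical measurements
volume = [(t + 273.15) / 273.15 for t in t_celsius]   # V in units of V(0 °C)

slope, intercept = fit_line(t_celsius, volume)
t_zero = -intercept / slope   # temperature at which V extrapolates to zero
print(round(t_zero, 2))       # → -273.15, the absolute zero of temperature
```

Since the fake data lie exactly on the ideal-gas line, the fit recovers −273.15 °C exactly; real 19th-century data scattered around it, but pointed at the same place.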

When Clausius coined the term entropy in ~~1850~~ 1865 (*), he was searching for a function that allows one to depict any process in a diagram such as the figure above, in a sense.
(*) Edit 1 – Jan. 31: Thanks to a true historian of science.

Heat is a vague term – it only exists ‘in transit’: heat is exchanged, but you cannot assign a certain amount of heat to a state. Clausius searched for a function that could be used to denote one specific state in such a map of states, and he came up with a beautiful and simple relationship: the differential change in heat is equal to the change in entropy times the absolute temperature! So temperature entered the mathematical formulations of the laws of thermodynamics via something really non-intuitive done with differentials.
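Clausius’ relation dQ_rev = T dS can be checked with a toy calculation. In this sketch (heat capacity and temperatures invented, roughly a kilogram of water warmed from 20 °C to 80 °C), we integrate dS = dQ/T step by step and compare with the closed form ΔS = C ln(T2/T1).

```python
import math

C = 4186.0           # heat capacity in J/K (roughly 1 kg of water; an assumption)
T1, T2 = 293.15, 353.15   # start and end temperatures in kelvin (invented)
steps = 100000

# dQ = C dT at constant heat capacity, so dS = C dT / T; midpoint rule:
dT = (T2 - T1) / steps
dS_sum = sum(C * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

print(dS_sum)                   # numerical integral of dQ/T
print(C * math.log(T2 / T1))    # closed form: C ln(T2/T1)
```

The two numbers agree, which is the whole point: unlike heat, the integral of dQ/T depends only on the initial and final states – entropy is a state function.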

Entropy really seems to be the more fundamental property. You could actually start from the Second Law and define temperature in terms of the efficiency of perfect machines that are limited only by the fact that entropy can only increase (or that heat always flows from the hotter to the colder object):

Stirling motor – converting heat drawn from a hot gas to mechanical energy. Its optimum efficiency would be similar to that of Carnot’s theoretical machine.
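That Second-Law definition is compact enough to write down directly: the best possible efficiency of any heat engine working between two reservoirs depends only on their absolute temperatures. The example temperatures below are invented.

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat convertible to work (Carnot limit).

    Temperatures must be absolute (kelvin): eta = 1 - T_cold / T_hot.
    """
    return 1.0 - t_cold / t_hot

# e.g. a hypothetical engine between 500 °C steam and 20 °C ambient:
print(carnot_efficiency(773.15, 293.15))
```

No detail of the machine enters – only the two temperatures. That is why the Carnot limit can be turned around and used to *define* an absolute temperature scale.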

The more we learned about the microscopic underpinnings of laws that had been introduced phenomenologically before, the less intuitive the explanations became. It does not help to try to circumvent entropy by considering what each of the particles in the system does. We think of temperature as some average over velocities (squared). But a single particle travelling its path through empty space would not have a temperature. Neither would the directed motion of a beam of particles contribute to temperature. So temperature is better defined in terms of the spread of a distribution of velocities.
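The spread-not-average point can be demonstrated with simulated particles. In this sketch (particle mass and temperature invented, roughly an argon atom at 300 K), the kinetic temperature per velocity component follows (1/2) k_B T = (m/2) var(v_x); adding a common drift to every particle – a directed beam – leaves the variance, and hence T, unchanged.

```python
import random
import statistics

k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.6e-26          # roughly the mass of an argon atom, kg (assumption)
T_true = 300.0       # temperature used to generate the sample (invented)
sigma = (k_B * T_true / m) ** 0.5   # 1D Maxwell-Boltzmann velocity width

random.seed(1)
v = [random.gauss(0.0, sigma) for _ in range(100000)]   # one velocity component

def kinetic_T(vs):
    """Temperature from the velocity spread; the mean (drift) is subtracted."""
    return m * statistics.pvariance(vs) / k_B

print(kinetic_T(v))                          # close to 300 K
print(kinetic_T([u + 500.0 for u in v]))     # same gas in a beam: same T
```

Shifting every velocity by 500 m/s changes the average speed drastically but not the deviations from the average – the beam as a whole is moving, not hotter.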

Even if we consider simple gas molecules, we could define different types of temperature: there is a kinetic temperature calculated from velocities. In the long run – when equilibrium has been reached – the other degrees of freedom (such as rotations) would exhibit the same temperature. But when a gas is heated up, heat is transferred via collisions: so first the kinetic temperature rises, and only then is the energy transferred to rotations. You could calculate a temperature from the rotations, and for a while this temperature would be different from the kinetic temperature.

So temperature is a property derived from what an incredible number of single particles do. It is a statistical property, and it only makes sense when a system has had enough time to reach an equilibrium. As soon as we push the microscopic constituents of the system so that they deviate from their equilibrium behaviour, we get strange results for temperature – such as negative values.

_______________________
This post was also inspired by some interesting discussions on LinkedIn a while ago – on the second law and the nature of temperature.
(*) Edit 2 – Feb. 2: Though Clausius is known as the creator of the term entropy, the concept as such had been developed earlier by Rankine.

## 17 thoughts on “Random Thoughts on Temperature and Intuition in Thermodynamics”

1. How interesting! It can get quite difficult and frustrating when we use the same words and phrases in theoretical physics and in everyday language. However, I think it’s impossible to keep people from adopting the same words for use in everyday language. So we have to constantly work with this difficulty, making sure to disambiguate the everyday terms from the scientific ones… while also using some helpful, intuitive concepts that everyday language illuminates. This is why good science writing will always be needed to help avoid misunderstandings about science that relate to the everyday. Thanks for the great post!

• Thanks for commenting! You are right – as much as I would wish to describe “the real stuff”, and as skeptical as I had once been about science popularization … I agree with you now 🙂
Science bloggers such as Aatish Bhatia prove that you can use compelling metaphors and still preserve the essence of a scientific argument.

• I am excited to see what you post about next! And I will definitely look at Aatish Bhatia’s writing – thank you.

• Thanks again! Chances are that my next post will again be tagged with ‘weird’. I feel the urge to create some Search Term Poetry or Spam Poetry. It is amazing what kind of search terms make visitors hit my blog and how intricate some spam is 😉 You have to recycle those!

• I’ll stay tuned!

• Thanks – I went for Spam Poetry now 😉

2. Thanks, enjoyed reading this post. Concerning Clausius, the Wiki link says he coined the term entropy in 1865?

• Hi Peter! Of course – thanks a lot for the correction! I should not apply the typical physicist’s way of dealing with numbers (” … order of magnitude is OK … “) to historical dates!

• Corrected while preserving error record 😉

• Thanks for your great post on Rankine – http://carnotcycle.wordpress.com/2013/02/01/rankine-on-entropy-love-and-marriage/ – I just added one more remark at the bottom of my post. Your posts on historical papers in thermodynamics provided me with one more non-intuitive aspect of thermodynamics, applicable to any discipline: It is really hard to follow explanations of things we think we are familiar with when they are explained in terms and language rooted in older paradigms.
To all the sci-fi geeks reading this blog – fans of the theory of relativity and stuff (waiting for articles on worm holes and quantum cryptography rather than thermodynamic machines…): If you want to get a taste of this (issue with intuition), I would recommend trying to read On the Shoulders of Giants: http://www.amazon.com/On-The-Shoulders-Of-Giants/dp/076241698X/ref=sr_1_3?ie=UTF8&qid=1359761432&sr=8-3 – Copernicus, Kepler, Galilei, Newton, Einstein, original works.
As I just mentioned in a comment on Peter’s blog – even Feynman failed to follow Newton’s geometrical derivation of calculus – so this is not exactly an easy read. But it reminds us of the capabilities our scientific ancestors once had (e.g. the true mastery of geometry) and that we – a generation grown up with computers and trained in abstract thinking (algebra / calculus) – have lost.

• Thank you Elke for the kind comment, and for putting the language of older scientific paradigms on the discussion agenda. When I started CarnotCycle, with its frequent focus on primary documentation from the 19th century, I didn’t think too much about it. But you have, and that is what matters.
A hundred years from now, I am sure that folk will similarly regard what you and I say today as the language of the ancients, although we see it as cutting-edge. I guess the same was true for the founders of thermodynamics in the 19th century, who surely must have considered themselves far advanced of their 18th century counterparts.
In any age, delivering the best available version of the truth is the best we can do. We necessarily express our knowledge in contemporary language which will become dated with the passage of time. But as a scientific endeavour, truth-seeking is independent of time. This is what binds the evolution of scientific enquiry, and the enlightenment we hope to gain from it, into a continuous process.
Our modern difficulty seems to be in understanding the language of our antecedents, so that we can affirm the time-independence of truth-seeking and properly see our own efforts as a part of this continuum?

• Thanks, Peter, for your thoughtful comment. I think our technical and science degree programs should include a bit of history of science and make students familiar with the way paradigms in their discipline have evolved over time.

I would indeed be interested in time-travelling to the future to learn what our future readers will say about us. I wonder if ‘the internet’, that is the accessibility of information, will make a difference? But I have once read that – despite all the worries about ‘Big Data’ – less of the information created today will finally survive, compared to books and papers written in the past. On the other hand, big digital archives are being created that help to preserve e.g. videos that would otherwise literally decompose.

3. Thanks for offering links that could fall under the ‘science for newbies’ banner. 🙂

• Thanks, Michelle! I really like the economic metaphors as they are understandable and the underlying math is represented correctly. MinutePhysics is awesome (all of their videos) – it is unbelievable what they are able to convey in a 3-minute video. Which is funny, as I am usually neither “into metaphors” nor “into cartoons”.

4. Oh God, you brought the steam table and the triple point, I’m melting … next you’ll bring in fluidomechanics … then whatever is left of me can be evacuated 🙂
(note: last summer I brought all my former notes on thermo and fluido to the recycling park. I felt a devilish satisfaction while doing so) 🙂

• Thanks! My textbooks are sacred – they will probably never be recycled 🙂
Re fluidomechanics: I had really considered writing about wind turbines and how to calculate their theoretical maximum efficiency – still on my to-be-blogged-about list!

• /-)
