
The Misunderstood Nature of Entropy
Season 4 Episode 32 | 10m 58s | Video has Closed Captions
Entropy is an intriguing and widely misunderstood concept.
Entropy is surely one of the most intriguing and misunderstood concepts in all of physics. The entropy of the universe must always increase – so says the second law of thermodynamics. It’s a law that seems emergent from deeper laws – it’s statistical in nature – and yet may ultimately be more fundamental and unavoidable than any other law of physics.

[MUSIC PLAYING] Entropy and the second law of thermodynamics have been credited with defining the arrow of time, predicting the ultimate heat death of the universe, providing the driving force for the development of structure as well as decay, and even excusing the messiness of your room.
But what is entropy really, and how fundamental is it to our universe?
[MUSIC PLAYING] Entropy is surely one of the most intriguing and misunderstood concepts in all of physics.
The entropy of the universe must always increase, so says the second law of thermodynamics.
It's a law that seems emergent from deeper laws.
It's statistical in nature, and yet, may ultimately be more fundamental and unavoidable than any other law in physics.
Einstein said that "the thermodynamics that encapsulates the second law is the only physical theory of universal content which I am convinced will never be overthrown."
And the great astrophysicist Sir Arthur Eddington warned, "if your theory is found to be against the second law of thermodynamics, I can give you no hope.
There is nothing for it but to collapse in deepest humiliation."
We've looked at entropy in the past, but it's time to go much deeper to unravel the great unraveler.
Over some upcoming episodes, we'll explore different aspects and consequences of entropy, including its role in black hole thermodynamics and how it will lead to the end of our universe.
But today, we'll see what entropy really is and why the second law of thermodynamics is considered to be so fundamental and so unavoidable.
Let's start from the beginning.
In 1824, Sadi Carnot published his "Reflections on the Motive Power of Fire," in which he revealed the theory for perfect engine efficiency.
Heat engines, which in Carnot's day were the newfangled steam engines, work by turning the flow of heat energy into mechanical energy.
For heat to flow, you need two reservoirs of different temperature.
A perfectly efficient engine, one undergoing the Carnot cycle, converts the maximum possible fraction of the transferred heat energy into useful work.
In principle, that work can then be converted back into heat and so the temperature differential can be reestablished.
However, an inefficient engine will slowly deplete the difference in temperature, reducing the heat flow, and the engine winds down.
Around half a century after Carnot, Rudolf Clausius was inspired to quantify this tendency of heat flows to decay over time. Enter entropy.
Clausius defined entropy as the internal property that changes as heat energy moves around within a system.
Specifically, the change in entropy of each reservoir is the heat energy going into or out of that reservoir divided by its temperature.
For a Carnot cycle, the overall change in entropy is zero but for any less efficient cycle, entropy increases.
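Clausius's bookkeeping can be sketched numerically. The reservoir temperatures and heat values below are illustrative numbers, not from the episode:

```python
# Clausius's entropy bookkeeping for heat flowing between two reservoirs.
# The temperatures and heat values are illustrative, chosen for this sketch.

T_HOT, T_COLD = 500.0, 300.0   # reservoir temperatures in kelvin
Q_HOT = 1000.0                 # heat (joules) drawn from the hot reservoir

def delta_S(q_out_hot, q_in_cold, t_hot, t_cold):
    """Total entropy change: heat in or out of each reservoir, divided by its temperature."""
    return -q_out_hot / t_hot + q_in_cold / t_cold

# Carnot cycle: a reversible engine rejects Q_HOT * (T_COLD / T_HOT) to the
# cold reservoir and turns the rest into work.
q_cold_carnot = Q_HOT * (T_COLD / T_HOT)
print(delta_S(Q_HOT, q_cold_carnot, T_HOT, T_COLD))  # ~0: reversible, entropy unchanged

# Pure conduction (no work extracted): all the heat lands in the cold reservoir.
print(delta_S(Q_HOT, Q_HOT, T_HOT, T_COLD))          # positive: entropy increases
```

The less work an engine extracts, the more heat arrives at the cold reservoir, and the larger the entropy increase per cycle.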
In fact, an increase in entropy means that the heat reservoirs are approaching the same temperature, reducing the capacity to do useful work.
Carnot and Clausius' work revealed entropy as a measure of how evenly spread out a system's energy is.
The more evenly spread, the less useful the energy is, and for an isolated system, the best you can hope for is that the separation of energy and the entropy remain constant.
In reality, it will almost always increase unless energy comes in from the outside to reestablish the temperature differential.
This understanding of entropy is in terms of flowing heat, and it came from the days when many, including Carnot himself, believed that heat was a physical fluid called caloric.
It took a revolution to understand the reality of entropy. That revolution was statistical mechanics, founded by the great Ludwig Boltzmann with his kinetic theory of gases.
This theory explained thermodynamic behavior as the summed result of the individual motion of tiny particles under Newton's laws of motion.
Stat mech is really astounding.
It's founded on an absurdly simple idea.
For a given set of large-scale observable properties, every possible configuration of particles that could give those properties is equally likely.
Let's add some physics speak.
By configuration, I mean the exact arrangement of positions, velocities, et cetera, of all microscopic particles.
We call this the microstate.
And we call the specific combination of large-scale macroscopic properties the macrostate.
Macrostates are entirely defined by thermodynamic properties: temperature, pressure, volume, and number of particles.
For a given macrostate, all microstates consistent with its thermodynamic properties are equally likely.
For some macrostates, there are lots of different microstates or arrangements of particles that lead to roughly the same thermodynamic properties, while other macrostates can be produced by only very few microstates.
OK. One more fact-- if you leave a system to do its own thing, it'll eventually try out every microstate that the laws of physics allow.
All particle arrangements will eventually happen.
So if you look at the system at some random point in time, it'll be in a completely random microstate chosen from all possible microstates.
And what macrostate will it be in?
Well, probably the one that's consistent with the most microstates.
We can think of these micro and macrostates with an analogy.
This is a Go board.
Let's say you place 180 black stones at random.
Every possible specific arrangement is considered a microstate, while the overall shape of the distribution would be the macrostate.
There are nearly 2 times 10 to the power of 107 ways to arrange the pieces and almost all of them are pretty evenly mixed, so roughly all the same macrostate.
Some microstates are weird though, and they give different macrostates because they have different average distributions.
For example, there's one where all of the stones are on one side.
That macrostate is a factor of about 2 times 10 to the power of 107 less likely than the smoothly mixed macrostate, even though each individual microstate is equally likely.
And the larger the board, the less likely it is to end up in such a weird arrangement.
For a room full of 10 to the power of 26 molecules of air, the chance of getting all of the molecules on one side of the room by chance is so small that it never happens.
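Those counts can be checked directly with Python's standard library. The board and stone numbers come from the analogy above; everything else is illustrative:

```python
import math

# A 19x19 Go board has 361 points; count the ways to place 180 identical stones.
POINTS, STONES = 19 * 19, 180
microstates = math.comb(POINTS, STONES)
print(f"{microstates:.2e}")          # ~2e+107, the figure quoted above

# Arrangements with every stone crammed onto one 180-point half of the board:
one_side = math.comb(180, STONES)    # exactly 1
print(one_side / microstates)        # vanishingly small

# Scaling up: the chance that N gas molecules all wander into one half of a
# room is (1/2)**N.  Even for N = 100 it's already ~8e-31; for N = 10**26 the
# exponent itself has 26 digits, so it simply never happens.
print(0.5 ** 100)
```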
We've been talking a lot about particle position, but really, that Go board is an analogy for all possible combinations of all properties-- position, momentum, spin, vibration, really any degree of freedom that the system can have.
We call this space of properties a phase space.
And instead of particles being distributed through position space, a microstate is really defined by how energy is distributed through phase space.
The average distribution of individual particles in phase space defines the thermodynamic properties of the system.
That's why these similarly-shaped distributions on the Go board correspond to the same macrostate, while the clustered spread does not.
OK.
So if you leave a system alone long enough, its particles and its energy will find their way into all the different forms that are possible.
The vast majority of possible distributions of energy leave the system very close to a single macrostate: the state of thermal equilibrium, in which energy is maximally spread out and temperature, pressure, density, volume, et cetera have the values we expect from classical thermodynamics.
Statistical mechanics tells us why large-scale systems have the properties they do, but what does this have to do with entropy?
Well, Ludwig Boltzmann figured that out, too.
The Boltzmann equation tells us that entropy is the Boltzmann constant times the logarithm of the number of microstates consistent with the current macrostate.
So our smoothly spread out equilibrium Go board has a high entropy and our clustered board has low entropy.
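In code, Boltzmann's formula is one line. A sketch reusing the Go-board counts; k_B is the real Boltzmann constant, and treating the mixed macrostate as owning essentially all of the arrangements is an approximation:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI redefinition)

def boltzmann_entropy(n_microstates):
    """S = k_B * ln(W), where W counts microstates consistent with the macrostate."""
    return K_B * math.log(n_microstates)

# Go-board analogy: the evenly mixed macrostate owns nearly all of the ~2e107
# arrangements, while the all-on-one-side macrostate owns just one.
S_mixed = boltzmann_entropy(math.comb(361, 180))
S_clustered = boltzmann_entropy(1)   # ln(1) = 0

print(S_mixed > S_clustered)  # True: more microstates means higher entropy
print(S_clustered)            # 0.0
```

The logarithm is what makes entropy additive: doubling a system multiplies its microstate count but only adds to its entropy.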
By the way, there are certain special microstates, special arrangements of particles that look highly ordered but are still consistent with a high-entropy macrostate.
For example, we could draw pictures or write words in phase space. This is where we get to a point of common confusion.
Order is not the same thing as low entropy, and the second law isn't always the tendency towards disorder.
In thermodynamic entropy, the only special arrangements of particles that change entropy are the ones that change the thermodynamic properties, not the ones that spell out cuss words or mess up your room.
To get deeper into that, we'll need to talk about information entropy, which we'll also need for black hole thermodynamics and which will take another episode.
OK.
So the macrostate that defines thermodynamic equilibrium is, by definition, the one with the most microstates, which also means the maximum entropy.
Any system not in equilibrium must increase in entropy, simply because at any future time its microstate will most likely be one of the more common types of microstate.
This is assuming you don't force the system from the outside.
I mean, it's possible to take each Go stone and place it on a particular spot to construct a special microstate or to use a vacuum pump and a glass wall to move all of the air to one side of the room.
In both cases, you are reducing the number of accessible microstates, which, by definition, must reduce entropy.
But to do so, you must introduce an external source of energy.
Heat must flow between your system and the outside universe in a way that increases the entropy of the universe as a whole.
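As a rough ledger, with illustrative temperatures: squeezing an ideal gas into half its volume at constant temperature costs the gas N times k_B times ln 2 of entropy, but the heat the pump must dump into cooler surroundings raises their entropy by more:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
N = 1e26            # molecules in the room (the figure used earlier)
T = 300.0           # gas temperature in kelvin (illustrative)
T_ENV = 290.0       # surroundings, slightly cooler (illustrative)

# Halving the volume halves each molecule's accessible position states,
# so the gas loses N * k_B * ln(2) of entropy.
dS_gas = -N * K_B * math.log(2)

# Isothermal compression expels Q = N * k_B * T * ln(2) of heat, which lands
# in the cooler surroundings and raises their entropy by Q / T_ENV.
Q = N * K_B * T * math.log(2)
dS_env = Q / T_ENV

print(dS_gas < 0)            # True: the room's entropy drops
print(dS_gas + dS_env > 0)   # True: the universe's entropy still rises
```

Locally lowering entropy is allowed; the second law only constrains the total.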
Statistical mechanics inevitably leads to entropy and the second law, and it does so by something so fundamental and basic that it's impossible to deny.
It comes from counting the ways that energy can be distributed.
The inevitability of the rise of entropy is as fundamental as counting. That's why Einstein and Eddington were so sure of it.
But entropy is also statistical and emerges from behavior of particles under the laws of motion.
This is where the second law appears to add something new to the universe not seen in the more fundamental laws.
It seems to add the arrow of time.
See, the laws of motion, whether Newtonian or quantum mechanical, don't care about the direction of time, and yet, the second law of thermodynamics clearly distinguishes between the past and the future.
We talked a little about this in our episode on the physics of life, where we saw how entropy drives both the increase and decay of complexity.
It's almost like the concept of time is emergent and statistical, just like entropy.
Again, we'll delve deeper into this in the future, but for now, please be careful to keep your number of accessible microstates low, avoid thermal equilibrium, and keep being that brilliant macrostate that is you, until I see you next week on "Space Time."