A way to think about entropy -- sharing (2013)



 

Prerequisites:

 

Recap: The main idea of entropy and the 2nd law

We've now read a lot about the second law of thermodynamics and the idea of entropy. The basic idea is that we are looking coarsely at a system that has a fine-grained structure that is changing rapidly in a random way. Specifically, we are looking at things macroscopically -- and by this we mean at a level at which the structure of matter in terms of molecules and atoms can't be seen. We only care about average properties of the molecules: things like temperature, pressure, and concentration. It is generally useful to ignore the fact that the molecules are actually moving around chaotically, colliding with each other, and that chemical reactions are happening (and unhappening).

 

What we are interested in is the following:

 

If two parts of a system we are considering are NOT in thermodynamic equilibrium, what will naturally tend to happen?

 

This is the question that the second law of thermodynamics answers. It tells us that if one part of a system is hotter than another, the natural spontaneous tendency of the system is for the temperature to even out. If a chemical reaction can occur, the reaction will proceed in one direction until the rate of the reverse reaction equals the rate of the forward reaction. When the forward and reverse rates are equal, the amount of each chemical stays the same; this is called chemical equilibrium.
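To make the idea of balancing rates concrete, here is a minimal numerical sketch of a reversible reaction A <-> B. The toy model and the rate constants in it are illustrative assumptions, not taken from the text:

```python
# Toy model of a reversible reaction A <-> B approaching chemical equilibrium.
# The rate constants kf and kr are made-up illustrative values.

kf, kr = 2.0, 1.0   # forward and reverse rate constants (1/s)
A, B = 1.0, 0.0     # initial concentrations (arbitrary units)
dt = 0.001          # time step for a simple Euler integration

for step in range(10000):
    forward = kf * A          # rate of A -> B
    reverse = kr * B          # rate of B -> A
    A += (reverse - forward) * dt
    B += (forward - reverse) * dt

# At equilibrium the two rates match: kf*A = kr*B, so B/A approaches kf/kr = 2
# and the concentrations stop changing.
print(f"A = {A:.3f}, B = {B:.3f}, B/A = {B/A:.3f}")
```

Whatever the starting concentrations, they stop changing exactly when the forward and reverse rates cancel -- that is the equilibrium condition described above.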

 

To understand these situations in general and to figure out which way things will happen under what conditions, we introduced the concept of entropy and the second law of thermodynamics. The core idea is that to each coarse-grained view of a particular system (its pressure, temperature, chemical concentrations, etc. -- its macrostate) there are many, many different possible arrangements and motions of the individual molecules (its microstates). The idea of the second law is:

 

A system that is not in thermodynamic equilibrium will spontaneously move toward the macrostate with the largest number of microstates.

 

The reason for this is that we assume that as the system goes through its various chaotic states, each microstate is equally probable. Therefore, the system will most often wind up in the macrostate that corresponds to the largest number of microstates.

 

Since the entropy is defined as (a constant times) the logarithm of the number of microstates, S = kB ln(W) (see Why entropy is logarithmic), the second law can be restated as:

 

Systems that are not in thermodynamic equilibrium will spontaneously transform so as to increase the entropy.
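Since the formula only involves a logarithm, it is easy to play with numerically. Here is a minimal sketch (the microstate counts are made-up illustrative numbers) of why the logarithm is the right choice: independent subsystems multiply their microstate counts, so their entropies add.

```python
import math

kB = 1.380649e-23  # Boltzmann's constant, in J/K

def entropy(W):
    """Entropy from the number of microstates: S = kB ln(W)."""
    return kB * math.log(W)

# Made-up microstate counts for two independent subsystems.
W1, W2 = 1e20, 1e30

# Independent subsystems multiply their microstate counts...
W_total = W1 * W2

# ...so the logarithm turns that product into a sum of entropies.
print(math.isclose(entropy(W_total), entropy(W1) + entropy(W2)))  # True
```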

 

Well. This is an impressive-sounding statement. But what does it mean? It's pretty plausible to think about flipping coins and deciding whether 5 heads and 5 tails is more or less likely to happen than tossing 10 heads in a row. But how does counting microstates help us see that hot and cold objects placed together will tend to go to a common temperature? You can do it, but it takes a LOT of heavy mathematical lifting -- and it doesn't particularly help us conceptually. Another way to think about it that might help is to think of entropy as a measure of sharing.
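The coin example, at least, is easy to check by direct counting. A minimal sketch, for 10 fair coins as in the paragraph above:

```python
from math import comb

# Flip 10 fair coins. Each particular sequence of heads and tails (a microstate)
# is equally likely, but the macrostates ("how many heads") contain very
# different numbers of sequences.
N = 10
for heads in range(N + 1):
    print(f"{heads:2d} heads: {comb(N, heads):3d} microstates")

# 5 heads and 5 tails: comb(10, 5) = 252 microstates.
# 10 heads in a row:   comb(10, 10) = 1 microstate.
# So the even split is 252 times more likely than all heads.
```

The temperature question is harder, but the same counting idea works; a sketch of it appears at the end of this page.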

 

If energy is uniformly spread, it's useless.

Thermodynamic equilibrium means that the energy in a system is uniformly spread among all the degrees of freedom (i.e., distributed evenly among all the places energy can go -- for example, for each molecule, among both its potential and kinetic energies). In such a state, the energy no longer "flows" from one set of molecules to another or from one kind of energy to another. Thus in thermodynamic equilibrium the energy in a system is useless; no work, either physical or chemical, can be done. If we want to think about how useful some energy is, we need to know not just how much energy we have but how it is distributed. The further the system is from equilibrium, the more useful its energy is. We are working towards developing the idea of not just energy, but free energy -- useful energy.

 

In some sense, entropy is a measure of how uniformly the energy is distributed in a system. If the system is fully at thermodynamic equilibrium, the entropy is a maximum. If the entropy is lower than the maximum, then there is room for the entropy to go up as the system moves towards thermodynamic equilibrium. The system will spontaneously and naturally redistribute its energy toward the equilibrium state. During such a redistribution, work can get done and an organism can make a living.
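Here, in miniature, is how counting microstates connects to energy being shared out and temperatures evening up. A standard toy model for this is a pair of "Einstein solids" sharing energy quanta; the sizes and quanta below are illustrative assumptions, not from the text:

```python
from math import comb

# Two toy "Einstein solids" with NA and NB oscillators share q_total energy
# quanta. The number of microstates (multiplicity) of one solid holding q
# quanta is Omega(N, q) = C(q + N - 1, q).
def omega(N, q):
    return comb(q + N - 1, q)

NA, NB = 300, 200   # degrees of freedom in each solid (illustrative values)
q_total = 100       # total energy quanta to share between them

# Count the microstates of the combined system for every way of splitting the
# energy, and pick the split with the most microstates.
best_qA = max(range(q_total + 1),
              key=lambda qA: omega(NA, qA) * omega(NB, q_total - qA))
print("most probable split:", best_qA, "quanta in A,", q_total - best_qA, "in B")
```

This prints a 60/40 split: the most probable macrostate is the one in which the energy is shared in proportion to the number of degrees of freedom -- the microstate-counting version of "the temperatures even out".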

 

Follow-on:

 

 

Joe Redish 1/29/12
