Boltzmann distribution



 

Prerequisites:

 

The Second Law of Thermodynamics says that in response to random interactions (such as molecular collisions), an isolated system will tend to move toward more probable states.  To demonstrate this, we used an example with coin tosses, saying that the arrangements HHHHHTTTTT and HHHHHHHHHH were equally probable, but there are many more ways to get 5 heads and 5 tails than to get 10 heads and 0 tails.  But real-life situations can be more complicated than coin tosses!  Sometimes each arrangement is not equally probable.  In particular, in a physical system, the probability of an arrangement may depend on its energy.  Therefore, to figure out what's going to happen when there are different energies involved, we need to take into account both energy and entropy.

 

For example, when two molecules collide in a thermal environment, some of the kinetic energy might go into the internal energy of one or both of the molecules. It could set the molecule vibrating or spinning, putting energy into the internal degrees of freedom of the molecule. So for all the molecules of a particular type in a block of matter, we expect that some of them will have more energy in their rotational degree of freedom than others. Let's try to figure out how the probability of an arrangement depends on its energy.

 

What we can infer from how two systems must combine

Let's imagine splitting our system into two parts, which we'll call A and B.  Now we can make two observations:

  • The energy of the entire system equals the energy of subsystem A plus the energy of subsystem B, or if you like, E_AB = E_A + E_B.  This should need no explanation!
  • The two parts of the system are each in some arrangement.  If P_A(X) is the probability that subsystem A is in arrangement X, and P_B(Y) is the probability that subsystem B is in arrangement Y, then what is the probability that the entire system (AB) is in the combined arrangement (XY)?  From the rules of probability, we know that we have to multiply them: P_AB(XY) = P_A(X) * P_B(Y).  For example, if you flip a coin and roll a die at the same time, the probability that you both flip tails and roll a 5 is (1/2)*(1/6) = 1/12.  (This is related to the principle that you saw in this '80s music video.)

 

In summary,

 

When systems combine, energies add and probabilities multiply. 

 

So if we express the probability of a given state as a function of the energy, it must satisfy the relationship P(E_A + E_B) = P(E_A) * P(E_B) for all possible values of E_A and E_B.

 

Creating a distribution that has the right properties

We need a probability distribution for energy with the property that when you add the energies of two parts of the system, the probabilities multiply. What types of functions satisfy this relationship?  Let's try an exponential, since we know e^(x+y) = e^x * e^y for all x and y.
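To see numerically that an exponential really does have this property, here is a tiny Python sketch (the function name and the sample energies are just for illustration):

```python
import math

# Check the combining rule P(E_A + E_B) = P(E_A) * P(E_B)
# for an exponential trial function. The constant "a" is arbitrary here.
def P_trial(E, a=1.0):
    return math.exp(a * E)

E_A, E_B = 0.7, 1.9                  # two sample energies (arbitrary units)
print(P_trial(E_A + E_B))            # 13.4637...
print(P_trial(E_A) * P_trial(E_B))   # the same number: energies add, probabilities multiply
```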

 

So can we just say the probability of arrangement i (with energy E_i) is P(i) = e^(E_i)?  No way!  E_i has units of energy, and you can't take an exponential of a quantity that has units. You wouldn't want to get a different result if you calculated in calories instead of joules! The exponent has to be dimensionless.  So we have to divide E_i (in the exponent) by something else with units of energy so that the units cancel.  Where can we get another energy?  Is there a natural energy scale to which we can compare the energy of a given arrangement (so that we end up with just a dimensionless ratio)?  Yes, there is:  As we saw before, we can use the Boltzmann constant k_B to convert the temperature of the system into an energy, so k_B T has units of energy, and E_i/k_B T is dimensionless.

 

So can we say that the probability function is P(i) = e^(E_i/k_B T)?  That's not obviously wrong in the way that e^(E_i) is - at least the expression makes mathematical sense.  But let's play the implications game to see whether it makes physical sense.  If you plug in numbers to this expression, you see that an arrangement with greater energy has a greater probability than another arrangement (at the same temperature) with less energy.  But that can't be right.  It should be more difficult to get to an arrangement with greater energy (and therefore less probable).

 

Let's try putting in a negative sign:  P(i) = e^(-E_i/k_B T).  Now does this make physical sense?  A graph of the relationship between probability and energy (at a given temperature) is shown at the right.

 

So as the energy of an arrangement increases, the probability decreases.  At very high energies, the probability never quite reaches zero, but becomes very small.  This distribution is known as the Boltzmann distribution.
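If you'd like to see the trend in numbers rather than a graph, here is a short Python sketch (the sample energies, measured in units of k_B T, are made up for illustration):

```python
import math

# Unnormalized Boltzmann factors e^(-E/kT) for a few arrangement energies,
# measured in units of the thermal energy k_B*T.
for E_over_kT in [0, 1, 2, 5, 10]:
    print(E_over_kT, math.exp(-E_over_kT))
# 0 -> 1.0, 1 -> 0.37, 2 -> 0.14, 5 -> 0.0067, 10 -> 0.000045
# Higher-energy arrangements are less probable, but the probability
# never reaches exactly zero.
```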

 

Now let's see how this probability distribution depends on temperature.  T is in the denominator inside the exponent (not something we're accustomed to seeing!).  As T increases, the denominator gets larger, which means the fraction E_i/k_B T gets smaller.  But this fraction in the exponent is negative, so the exponent actually gets less negative, i.e. larger, so the probability of a given arrangement increases as you go to higher temperature.  In other words, raising the temperature of the whole system makes it easier to get to higher-energy arrangements.  This relationship looks like the figure at the right.


 

There's just one part of this result that can't possibly be right:  Try plugging in 0 for the energy, and you get e^0 = 1 for the probability, suggesting that there is a 100% probability that the system will be in a state with zero energy!  Impossible, because the sum of all the probabilities has to add up to 1.  To fix this, we can put a normalization constant out in front:  P(i) = c e^(-E_i/k_B T).  But we won't worry about what this constant actually is, because we're mainly concerned about the relative probabilities of two different states at the same temperature, and when we take the ratio of two probabilities, the constant cancels out:  P(i)/P(j) = e^(-(E_i - E_j)/k_B T)
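As a quick sanity check that the unknown constant really does drop out of a ratio, here is a minimal Python sketch (the value of c and the two energies are arbitrary):

```python
import math

kT = 1.0               # work in units where k_B*T = 1 (arbitrary choice)
c = 0.123              # some unknown normalization constant
E_i, E_j = 3.0, 1.0    # two arrangement energies, in units of kT

P_i = c * math.exp(-E_i / kT)
P_j = c * math.exp(-E_j / kT)
print(P_i / P_j)                       # 0.1353...
print(math.exp(-(E_i - E_j) / kT))     # the same number: c has cancelled out
```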

 

For example, at room temperature (300 K), k_B T ≈ 1/40 eV ≈ 4.14 x 10^-21 J, or RT = N_A k_B T ≈ 2.5 kJ/mol.  So if there are two different arrangements with an energy difference of 5 kJ/mol between them, then the lower-energy arrangement is 7.4 times as probable as the higher-energy arrangement.  (Run the numbers for yourself to check this.)  However, if we heat the system up to 600 K, then the lower-energy arrangement is only 2.7 times as probable: it becomes easier to access the higher-energy arrangement.
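Here is one way to run those numbers (a Python sketch using the 5 kJ/mol energy difference from the example):

```python
import math

R = 8.314      # gas constant, J/(mol K)
dE = 5000.0    # energy difference between the two arrangements, J/mol

for T in (300.0, 600.0):
    ratio = math.exp(dE / (R * T))   # P(lower energy) / P(higher energy)
    print(T, round(ratio, 1))
# 300.0 7.4
# 600.0 2.7
```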

 

Here's what the Boltzmann distribution looks like at different temperatures:

 

Note that at high excitation energies, the ordering of the curves for the different T's is as we would expect: higher excitation energies are more probable at higher temperatures. But at low energies it works the opposite way! This is because of the normalization constant, c, which depends on temperature. A good way of thinking about it is this: If we start at a temperature T1, we have the distribution shown. If we add heat and raise the temperature, the higher-energy excitations become more probable. But that probability has to come from somewhere! It comes from depleting the highly probable low-energy excitations and moving them up to higher energies. Since we are plotting a probability, the area under the curve has to be 1. If we raise it at the high end, we have to lower it at the low end.
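A small Python sketch can make this concrete, using the explicitly normalized continuous form P(E) = (1/k_B T) e^(-E/k_B T) that comes up in the comments below (the sample energies are just illustrative):

```python
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K

def P(E, T):
    """Normalized continuous Boltzmann distribution, P(E) = (1/kT) e^(-E/kT)."""
    kT = k_B * T
    return math.exp(-E / kT) / kT

E_low, E_high = 0.0, 1.0e-20      # a low and a high excitation energy, in joules
for T in (300.0, 600.0):
    print(T, P(E_low, T), P(E_high, T))
# As T goes from 300 K to 600 K, P(E_high) grows while P(E_low) = 1/kT falls,
# keeping the total area under each curve equal to 1.
```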

 

Postscript

The Boltzmann distribution only applies to individual arrangements of particles.  But if there are multiple arrangements that give the same overall energy (just as we showed that there are multiple ways to flip 10 coins and get 5 heads and 5 tails), we still need to multiply the probability of each arrangement by the number of arrangements.  We'll explore the consequences of combining energy and entropy (probability) in the follow-on page. 

 

Follow-ons:

 

 

Ben Dreyfus 11/29/11

Comments (7)

Marco said

at 6:43 am on Jul 20, 2015

I think the T3 and T1-lines should be the other way around!

Joe Redish said

at 7:53 am on Jul 20, 2015

Nope. Look at the results at higher energy. The probability of getting a high energy is greater at a higher temperature. (The T3 line is above the T2 line, which is above the T1 line, at high energy.)

Marco said

at 8:32 am on Jul 20, 2015

Using P(E_i) = c · e^(-E_i/k_B T): at E (almost) 0, the exponential becomes (almost) 1, at all temperatures. So from E (almost) 0 on, T2 will always be above T1 and T3 always above T2. In my opinion that's just a mathematical consequence.

Joe Redish said

at 8:46 am on Jul 20, 2015

You've mis-done your math. When E is 0 P(0) becomes c, not 1. And since it's a probability, c is adjusted to make the integral 1. As T increases, more of the probability goes to higher energy so c decreases as is shown in the figure. In fact doing the integral explicitly shows that c and therefore the probability of finding 0 energy is proportional to 1/T.

Joe Redish said

at 9:00 am on Jul 20, 2015

Explicitly, the integral of e^(-At) from 0 to infinity is 1/A, so to normalize the integral to 1 you have to multiply by A. The BD is therefore explicitly P(E) = (1/kT)e^(-E/kT).

Marco said

at 9:07 am on Jul 20, 2015

Ahhhh, I see! I missed out on the normalisation (or that c is a function of time). Thank you so much for your reactions!!!

Marco said

at 9:55 am on Jul 20, 2015

I meant ... a function of temperature ... of course ;-)
