
Chapter 2

Statistical Description
of Systems of Particles
Discussion of the General Problem
• We’ve reviewed elementary probability & statistics.
• Now, we are ready to talk about
PHYSICS
• In this chapter (& the rest of the course) we’ll combine statistical
ideas with the

Laws of Classical or Quantum Mechanics


≡ Statistical Mechanics
We can use either the classical or the quantum
description of a system.
Of course, which description is appropriate depends on the problem.
Four Essential Ingredients
for a Statistical Description of a Physical System
with many particles (~ an outline of Ch. 2!)
1. Specification of the State (“macrostate”) of the system.
We need a detailed method for doing this. This is discussed in this chapter.
2. Statistical Ensemble:
We need to decide exactly which ensemble to use.
This is also discussed in this chapter.
– In either Classical Mechanics or Quantum Mechanics:
• If we had a detailed knowledge of all positions & momenta of all
system particles & if we knew all inter-particle forces, we could (in principle)
set up & solve the coupled, non-linear differential equations of motion. We
could find EXACTLY the behavior of all particles for all time!
• In practice we don’t have this information. Even if we did, such a
problem is impractical, if not impossible, to solve!
• Instead, we’ll use statistical/probabilistic methods.

Ensemble
Now, let's think of doing MANY (≡ N) similar experiments
on the system of particles we are considering.
– In general, the outcome of each experiment will be different.
– So, we ask for the PROBABILITY of a particular
outcome. This PROBABILITY ≡ the fraction of cases
out of N experiments which have that outcome.
– This is how probability is determined by experiment.
– One of the goals of
STATISTICAL MECHANICS
is to predict this probability theoretically.
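To make this concrete, here is a minimal Python sketch (my own toy illustration, not from the text): repeat a simple "experiment" on a 3-spin system many times and estimate a probability as the fraction of the N experiments giving a particular outcome.

```python
# Toy illustration: estimate a probability as the fraction of outcomes in
# an ensemble of N_exp repeated "experiments". Here the "system" is 3
# independent spin-1/2 particles; each experiment records whether exactly
# one spin came out "down".
import random

def one_experiment(num_spins=3):
    """Return a random microstate: a tuple of +1 (up) / -1 (down) spins."""
    return tuple(random.choice((+1, -1)) for _ in range(num_spins))

N_exp = 100_000                               # size of the ensemble
hits = sum(1 for _ in range(N_exp)
           if sum(one_experiment()) == 1)     # exactly one spin down
print("estimated probability:", hits / N_exp)  # -> about 3/8 = 0.375
```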
• Next, we need to start somewhere with a theory. So we need to assume

3. A Basic Postulate about a priori Probabilities.


"a priori" ≡ prior (based on our prior knowledge of the system)
– Our knowledge of a given physical system leads us to expect that there is
NOTHING in the laws of mechanics (classical or quantum) which would
result in the system preferring to be in any particular one of its
Accessible States.
– So, (if we have no contrary experimental evidence) we make the hypothesis
that it is equally probable (or equally likely) that the system is in
ANY ONE of its accessible states. This postulate seems reasonable &
doesn't contradict any laws of mechanics (classical or quantum). But is it correct?
– It can only be confirmed by checking theoretical predictions &
comparing those to experimental observation!
Physics is an experimental science!!
Sometimes, 3. is called the Basic Postulate of Statistical Mechanics!
Finally, we can do some calculations!
4. Probability Calculations

– Once we have the Basic Postulate, we can use
Probability Theory to predict the outcome of
experiments.

Now, we will go through steps 1., 2., 3., 4. in detail!


Statistical Formulation of the Mechanical Problem
Sect. 2.1: Specification of the State of a System
“State” ≡ Macrostate
• Consider any system of particles. We know that the particles will
obey the laws of Quantum Mechanics (we’ll discuss the Classical
description shortly). We’ll emphasize the Quantum treatment.
• Specifically, a system with f degrees of freedom can be described
by a (many particle!) wavefunction Ψ(q1,q2,….qf,t), where
q1,q2,….qf ≡ a set of f generalized coordinates which are required
to characterize the system (needn’t be position coordinates!)
– A particular quantum state (macrostate) of the system is specified by giving
values of some set of f quantum numbers.
– If we specify Ψ at a given time t, we can (in principle) calculate it at any
later time by solving the appropriate Schrödinger Equation.
– Now, let's look at some simple examples, which might also review a little
elementary Quantum Mechanics for you.
Example 1
• The system is a single particle, fixed in position.
• It has intrinsic spin ½ (intrinsic angular momentum = ½ћ).
• In the Quantum Description of this system, the
state of the particle is specified by specifying the
projection m of this spin along a fixed axis
(which we usually call the z-axis).
• The quantum number m can thus have 2 values:
½ (“spin up”) or -½ (“spin down”)
So, there are 2 possible states of the system.
Example 2
• The system is N particles (non-interacting), fixed in
position. Each has intrinsic spin ½ so EACH particle’s
quantum number mi (i = 1, 2, …, N) can have one of the 2
values ±½. Suppose that N is HUGE, N ~ 10^24.
• The state of this system is then specified by specifying the
values of EACH of the quantum numbers m1, m2, …, mN.
⇒ There are 2^N unique states of the system!
With N ~ 10^24, this number is
HUGE!!!
Example 3
• The system is a quantum mechanical, one-dimensional,
simple harmonic oscillator, with position coordinate x &
classical frequency ω.
So the Quantum Energy of this system is:
E = ћω(n + ½). (n = 0,1,2,3,….).
The quantum states of this oscillator are specified by
specifying the quantum number n.
There is essentially an
INFINITE NUMBER
of such states!
Example 4
• The system is N quantum mechanical, one-dimensional,
simple harmonic oscillators, at positions xi, with
classical frequencies ωi (i = 1,2,.. N).
So the Quantum Energies of each particle in this system are:
Ei = ћωi(ni + ½). (ni = 0,1,2,3,….).
The system’s quantum states are specified by specifying
the values of each of the quantum numbers ni.
Here also, there is essentially an
INFINITE NUMBER of such states,
but there are many more of them than in Example 3!
Example 5
• The system is one particle, of mass m, confined to a rectangular
box, but otherwise free. Taking the origin at a corner:
0 ≤ x ≤ Lx, 0 ≤ y ≤ Ly, 0 ≤ z ≤ Lz
The particle is described by the QM wavefunction ψ(x,y,z), a solution to the
Schrödinger Equation
[-ћ²/(2m)][(∂²/∂x²) + (∂²/∂y²) + (∂²/∂z²)]ψ(x,y,z) = Eψ(x,y,z)
Using the boundary condition that ψ = 0 on the box faces, it can be shown that:
ψ(x,y,z) = [8/(LxLyLz)]^½ sin(nxπx/Lx) sin(nyπy/Ly) sin(nzπz/Lz)
nx, ny, nz are 3 quantum numbers (positive integers).
The particle Quantum Energy is:
E = [(ћ²π²)/(2m)][(nx²/Lx²) + (ny²/Ly²) + (nz²/Lz²)]
The quantum states of this system are found by specifying the values of nx, ny, nz.
Again, there is essentially an infinite number of such states.
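As an aside, a short Python sketch (my own illustration; the box size, particle mass, and energy cutoff are arbitrary choices) shows how these particle-in-a-box states can be enumerated and counted below a cutoff energy using the formula above.

```python
# A minimal sketch (assumed parameters): enumerate the quantum states of
# one particle in a rectangular box and count how many lie below a cutoff
# energy, illustrating how quickly the number of states grows.
import math

hbar = 1.054571817e-34      # J*s
m = 9.109e-31               # kg (electron mass, chosen arbitrarily)
Lx = Ly = Lz = 1e-9         # a 1 nm cubic box (assumed)

def E(nx, ny, nz):
    """Energy of the state (nx, ny, nz); the n's are positive integers."""
    return (hbar**2 * math.pi**2 / (2*m)) * (nx**2/Lx**2 + ny**2/Ly**2 + nz**2/Lz**2)

E_max = 100 * E(1, 1, 1)    # arbitrary cutoff
n_cap = 30                  # large enough that E(n_cap, 1, 1) > E_max
count = sum(1 for nx in range(1, n_cap)
              for ny in range(1, n_cap)
              for nz in range(1, n_cap)
              if E(nx, ny, nz) <= E_max)
print("states with E <= E_max:", count)
```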
Example 6
• The system is N particles, non-interacting, of mass m, confined to a
rectangular box, but otherwise free. Take the origin at a corner
0 ≤ x ≤ Lx, 0 ≤ y ≤ Ly, 0 ≤ z ≤ Lz.
Since they are non-interacting, each particle is described by the QM
wavefunction ψi(x,y,z), which is a solution to the
Schrödinger Equation
[-ћ²/(2m)][(∂²/∂x²) + (∂²/∂y²) + (∂²/∂z²)]ψi(x,y,z) = Eiψi(x,y,z)
Using the boundary condition that ψi = 0 on the box faces, it can be shown that:
ψi(x,y,z) = [8/(LxLyLz)]^½ sin(nxπx/Lx) sin(nyπy/Ly) sin(nzπz/Lz)
nx, ny, nz are 3 quantum numbers (positive integers) for each particle.
Each particle's Quantum Energy is:
Ei = [(ћ²π²)/(2m)][(nx²/Lx²) + (ny²/Ly²) + (nz²/Lz²)]
The quantum states of this system are found by specifying the values of nx, ny, nz for
each particle. Again, there is essentially an infinite number of such states.
What about the
Classical Description
of the state of a System?
Of course, the
Quantum Description
is always correct!
However, it is often useful & convenient to make the
Classical Approximation.
How do we specify the state
of the system then?
• Let's start with a very simple case:
• Consider a Single Particle in 1 Dimension
– In classical mechanics, it can be completely described in terms of
its generalized position coordinate q & its momentum p.
• The usual case is to consider the
Hamiltonian Formulation
of classical mechanics, where we talk of generalized
coordinates q & generalized momenta p, rather than the
Lagrangian Formulation,
where we talk of coordinates q & velocities (dq/dt).
– Of course, the particle obeys
Newton’s 2nd Law
under the action of the forces on it. Equivalently, it obeys
Hamilton’s Equations of Motion.
• So q & p completely describe the particle. Given q, p at
any initial time (say, t = 0), they can be determined at
any other time t by integrating the (Newton’s 2nd Law)
Equations of Motion
forward in time.
⇒ Knowing q & p at t = 0 in principle allows us
to know them for all time t.
⇒ q & p completely describe the particle for all time.
• This situation can be abstractly represented in the
following geometric way:
• Consider the (abstract) 2-dimensional space defined by q, p:
≡ “Classical Phase Space” of the particle. (Figure)
At any time t, stating the (q, p) of the
particle describes it’s
“State”.
Specification of the
“State of the Particle”
is done by stating which point in
this plane the particle “occupies”.

Of course, as q & p change in time, according to the equations of motion,
the point representing the particle "State" moves in the plane.
• q, p are continuous variables, so an infinite number of points are in this
Classical Phase Space.
• We'd like to describe the particle "State" classically in a way that
the number of states is countable.

⇒ It is convenient to subdivide the ranges
of q & p into small rectangles of size
δq δp.

Then, think of this 2-d phase space as divided into cells of equal area:

δq δp ≡ h₀
h₀ ≡ small constant with units of angular momentum (or "Action").
• 2-d phase space cells of area: δq δp = h₀.
• The particle "State" (classical) is specified by stating which
cell in phase space the (q, p) of the particle is in. Or, by
stating that its coordinate lies between q & q + δq & that
its momentum lies between p & p + δp.

“State” ≡
phase space cell labeled by
the (q,p) that the particle
occupies.
• This involves the "small" parameter h₀, which is somewhat
arbitrary. However, we can use Quantum Mechanics &
the Heisenberg Uncertainty Principle:
"It is impossible to SIMULTANEOUSLY
specify a particle's position & momentum
to a greater accuracy than δq δp ≥ ½ћ"
• So, the minimum value of h₀ is clearly ½ћ.
As h₀ → ½ћ, the classical description of
the State approaches the quantum description &
becomes more & more accurate.
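A minimal Python sketch of this cell-labelling idea (my own illustration; the particular way the cell area is split into δq and δp is an arbitrary assumption): the classical "state" is just the label of the phase-space cell containing (q, p).

```python
# A minimal sketch (illustrative cell sizes assumed) of the classical
# "state": the phase-space cell of area delta_q * delta_p = h0 that a
# given point (q, p) occupies.
import math

hbar = 1.054571817e-34          # J*s
h0 = 0.5 * hbar                 # smallest sensible cell area (uncertainty limit)
delta_q = 1e-10                 # m, arbitrary choice of how to split the cell
delta_p = h0 / delta_q          # so that delta_q * delta_p = h0

def cell(q, p):
    """Integer label of the phase-space cell containing the point (q, p)."""
    return (math.floor(q / delta_q), math.floor(p / delta_p))

# Two points in different q-cells correspond to different classical "states":
print(cell(3.2e-10, 2.0e-24), cell(4.1e-10, 2.0e-24))
```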
• Now!! Let's generalize all of this to a
MANY PARTICLE SYSTEM
– 1 particle in 1 dimension means we have to deal with
2-dimensional phase space.
– The generalization to N particles is straightforward, but
requires thinking in terms of very abstract
multidimensional phase spaces.
• Consider a system with f degrees of freedom:
⇒ The system is classically described by
f generalized coordinates: q1, q2, q3, …, qf
& f generalized momenta: p1, p2, p3, …, pf.
⇒ A complete description of the classical "State"
of the system requires the specification of:
f generalized coordinates: q1, q2, q3, …, qf
& f generalized momenta: p1, p2, p3, …, pf.
(N particles in 3 dimensions ⇒ f = 3N!)
• So, now let's think VERY abstractly in terms of a
2f-dimensional phase space
The f generalized coordinates: q1,q2,q3, …qf.
& f generalized momenta: p1,p2,p3, …pf are regarded as a
point in the 2f-dimensional phase space of the system.
2f-dimensional phase space: f q’s & f p’s:
• Each q & each p label an axis (analogous to the
2-d phase space for 1 particle in 1 dimension).
• Again, we subdivide this phase space into small "cells" of
2f-dimensional "volume":
δq1 δq2 δq3 … δqf δp1 δp2 δp3 … δpf ≡ (h₀)^f

The classical “State” of the system


≡ the cell in this 2f-dimensional phase
space the system “occupies”.
• Reif, like all modern texts, takes the viewpoint that the system's
"State" is described by a 2f-dimensional phase space
≡ "The Gibbs Viewpoint": the system "State" ≡
the cell in this phase space the system "occupies".
• Older texts take a different viewpoint
≡ "The Boltzmann Viewpoint":
In this viewpoint, each particle moves in
its own 6-dimensional phase space &
the "State" of the system requires specifying the cell in
this phase space that each particle in
the system "occupies".
Summary
Specification of the State of the System:

• In Quantum Mechanics:
– Enumerate & label all possible quantum states of the system.
• In Classical Mechanics:
– Specify which cell in 2f-dimensional phase space
(all coordinates & momenta of all particles) the system occupies.

As h₀ → ½ћ, the classical & quantum
descriptions become the same.
Section 2.2: Statistical Ensemble
• As we know, Statistical Mechanics deals with the
behavior of systems of a large number of particles.
• Because the number of particles is so huge, we give up trying to
keep track of individual particles. We can't solve
Schrödinger's equation in closed form for helium (4 particles), so
what hope do we have of solving it for the gas molecules in this
room (~10^24 particles)?

• Statistical Mechanics handles many particles by
calculating the most probable behavior of the system
as a whole, rather than by being concerned with the
behavior of individual particles.
In statistical mechanics, we assume that the more ways there are to
arrange the particles to give a particular distribution of energies, the
more probable that distribution is. (This seems reasonable, doesn't it?)
Begin with an assumption that we believe describes nature.

6 units of energy, 3 particles to give it to:
– Distribution (4,1,1): arrangements 411, 141, 114 ⇒ 3 ways
– Distribution (3,2,1): arrangements 321, 312, 213, 231, 123, 132 ⇒ 6 ways ⇒ more likely
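A quick way to verify (and extend) this counting is to enumerate the arrangements directly. The sketch below is my own illustration, assuming each particle carries at least one unit of energy, as in the arrangements listed above.

```python
# Counting sketch for the example above: how many ordered ways can 3
# particles share 6 units of energy, grouped by the (unordered) energy
# distribution?  More arrangements = more probable distribution.
from itertools import product
from collections import Counter

ways = Counter()
for e in product(range(1, 7), repeat=3):       # energies of particles 1, 2, 3
    if sum(e) == 6:
        ways[tuple(sorted(e, reverse=True))] += 1

for dist, n in sorted(ways.items(), key=lambda kv: -kv[1]):
    print(dist, "->", n, "arrangements")
# -> (3, 2, 1) -> 6 arrangements, (4, 1, 1) -> 3, (2, 2, 2) -> 1
```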
• In principle, the problem of a many particle
system is completely deterministic:
• If we specify the many particle wavefunction Ψ (state)
of the system (or the classical phase space cell) at time
t = 0, we can determine Ψ for all other times t by
solving the time-dependent Schrödinger Equation.
& from Ψ(t) we can calculate all observable quantities.
Or, classically, if we specify the positions &
momenta of all particles at time t = 0, we can
predict the future behavior of the system
by solving the coupled many particle
Newton’s 2nd Law equations of motion.
• In general, we usually don’t have such a
complete specification of the system available.
– We need f quantum numbers, but f ≈ 10^24!
• Actually, we usually aren’t even interested in such
a complete microscopic description anyway. Instead,
we’re interested in predictions of
MACROSCOPIC properties.
⇒ We use Probability & Statistics.
⇒ To do this we need the concept of an
ENSEMBLE.
• A Statistical Ensemble is a LARGE number
(≡ N) of identically prepared systems.
– In general, the systems of this ensemble will
be in different states & thus will have different
macroscopic properties.
⇒ We ask for the probability that a given
macroscopic parameter of interest
will have a certain value.
• A Goal or Aim of Statistical Mechanics is to
Predict this probability
• Example: Consider the spin problem again. But, now,
let the system have N = 3 particles, fixed in
position, each with spin = ½.

⇒ Each spin is either "up" (↑, m = +½)
or "down" (↓, m = -½).
• Each particle has a vector magnetic moment μ.
The projection of μ along a “z-axis” is either:
μz = μ, for spin “up”
μz = -μ, for spin “down”
• Now, put this system into an external magnetic field H.
– Classical E&M tells us that a particle with magnetic
moment μ in an external field H has energy:
ε = - μ∙H
– Let's combine this with the Quantum Mechanical result:
⇒ This tells us that each particle has 2 possible energies:
ε+ ≡ - μH for spin "up"
ε- ≡ + μH for spin "down"
⇒ So, for 3 particles, the State of the system
is specified by giving the values of (m1, m2, m3).
⇒ There are 2^3 = 8 POSSIBLE STATES
The Possible States of a 3-Spin System
(+,+,+):  E = -3μH
(+,+,-), (+,-,+), (-,+,+):  E = -μH
(+,-,-), (-,+,-), (-,-,+):  E = +μH
(-,-,-):  E = +3μH

Given that we know no other information about this system, all
we can say about it is that it has equal probability of
being found in any one of these 8 states.
• However, if (as is often the case in real problems) we have a partial
knowledge of the system (say, from experiment), then, we know that
the system can be only in any one of the states which are
COMPATIBLE with our knowledge
(That is, it can only be in one of its accessible states).

The "States Accessible to the System" ≡
those states which are compatible with all of the
knowledge we have about the system.
It is important to use all the information we
have about the system!
Example
• For our 3 spin system, suppose that we measure the total
system energy & we find that
E ≡ - μH

• This additional information limits the states which are
accessible to the system.
• From the table, there are clearly only 3 states which
are compatible with this knowledge.
⇒ The system must be in one of the 3 states:
(+,+,-) (+,-,+) (-,+,+)
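The same accessibility argument can be checked with a few lines of Python (my own sketch, in units where μ = H = 1): enumerate all 2³ states and keep only those whose total energy matches the measured value.

```python
# A minimal sketch of the accessibility argument: enumerate all 2^3 = 8
# spin states, keep those compatible with the measured total energy
# E = -mu*H  (mu = H = 1 is an assumption for brevity).
from itertools import product

mu = H = 1.0

def energy(state):                       # state = tuple of +1 (up) / -1 (down)
    return -mu * H * sum(state)          # each "up" spin contributes -mu*H

all_states = list(product((+1, -1), repeat=3))
accessible = [s for s in all_states if energy(s) == -mu * H]
print(len(all_states), "states total;", len(accessible), "accessible:", accessible)
# -> 8 states total; 3 accessible (the states with exactly two spins up)
```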
Sect. 2.3: The Basic (or Fundamental)
Postulate of Statistical Mechanics
• Definition:
An ISOLATED
SYSTEM
≡ A system which has no interaction of any kind
with the “outside world”
• This is clearly an idealization! Such a system has
No Exchange of Energy with the outside world.
⇒ The laws of mechanics tell us that the
total energy E is conserved:
E ≡ Constant
• So, an isolated system is one for which
total energy is conserved
• Now, consider an isolated system. The total energy E is constant
& the system is characterized by this energy. So,
⇒ All states accessible to it MUST have this energy E.
• For many particle systems, there are usually a HUGE number of
states with the same energy.
Question
What is the probability of finding the system in
any one of these accessible states?
• Before answering this, let’s define the term “Equilibrium”.
A System in Equilibrium is one for which the
Macroscopic Parameters characterizing it are
independent of time.
• Now, consider an isolated system in equilibrium:
• In the absence of any experimental data on some specific system
properties, all we can really say about this system is that it must
be in one of its accessible states (with that energy).
If this is all we know, we can "handwave" the following:
There is nothing in the laws of mechanics (classical or quantum) which would
lead us to suspect that, for an ensemble of similarly prepared systems,
we should find the system in some (or any one) of its accessible
states more frequently than in any of the others. So,

⇒ It seems reasonable to ASSUME that the system is
EQUALLY LIKELY TO BE FOUND IN
ANY ONE OF ITS ACCESSIBLE STATES
• In equilibrium statistical mechanics, we do make this assumption &
elevate it to the level of a POSTULATE.
THE FUNDAMENTAL (or BASIC)
POSTULATE OF (equilibrium)
STATISTICAL MECHANICS:

An isolated system in equilibrium is
EQUALLY LIKELY TO BE FOUND IN ANY
ONE OF ITS ACCESSIBLE STATES

This is sometimes called
the Postulate of Equal a priori Probabilities.
• This is the basic postulate (really the only postulate) of
equilibrium statistical mechanics.
• With this postulate, we can (& we will) derive
ALL of classical thermodynamics,
classical statistical mechanics,
& quantum statistical mechanics.
• It is reasonable & it doesn’t contradict any laws of classical or quantum
mechanics. But, is it valid & is it true? Remember that
PHYSICS IS AN EXPERIMENTAL SCIENCE
• Whether this postulate is valid or not can only be decided by
comparing the predictions of a theory based
on it with experimental data.
• A HUGE quantity of data exists! None has been found to be in
disagreement with the theory based on this postulate.
• So, let's accept it & continue on.
Some Simple Examples
Example 1
• Back to the previous example of 3 spins, an isolated system in
equilibrium. Suppose that the total energy is measured & found to be:
E ≡ - μH.
• We’ve seen that the only 3 possible system states consistent with this
energy are: (+,+,-) (+,-,+) (-,+,+)
• So ⇒ The Fundamental Postulate tells us that, when
the system is in equilibrium, it is equally likely
(with probability = ⅓) to be in any one of these 3 states.
• Note: This probability is about the system, it is NOT about
individual spins. Under these conditions, it is obviously NOT
equally likely that an individual spin is “up” & “down”. In fact, it is
twice as likely for a given spin to be “up” as “down”.
Example 2
• Consider N (~ 10^24) spins, each with spin = ½. Put the
system in an external magnetic field H. Suppose that the
total energy is measured & found to be:
E ≡ - μH.
• This is similar to the 3 spin system, but now there are a HUGE
number of accessible states. The number of accessible states is
equal to the number of possible ways for the energy of N spins to
add up to - μH.

⇒ The FUNDAMENTAL POSTULATE tells us that,
when the system is in equilibrium, it is equally likely
to be in any one of this HUGE number of states.
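For a feel for how huge this number is, here is a short Python sketch of the counting (my own arithmetic based on the spin energies above; it assumes N is odd, so that the energies can add up to exactly -μH): the count is the binomial coefficient C(N, (N+1)/2), reported as a log10 for large N.

```python
# A rough illustration of "HUGE": with N spins and total energy E = -mu*H,
# the number of "up" spins must be (N+1)/2 (N odd), so the number of
# accessible states is C(N, (N+1)/2).  For large N, report log10 of the
# count using Stirling's formula via lgamma.
import math

def log10_count(N):
    """log10 of C(N, (N+1)//2): # of N-spin states with energy -mu*H (N odd)."""
    k = (N + 1) // 2
    ln_c = math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)
    return ln_c / math.log(10)

print(math.comb(3, 2))                    # the 3-spin case: 3 accessible states
print(f"{log10_count(10**6 + 1):.4g}")    # ~3.0e5: the count has ~300,000 digits
print(f"{log10_count(10**24 + 1):.4g}")   # ~3.0e23: an incomprehensibly large count
```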
Example 3
• A classical illustration. Consider a 1-dimensional, classical,
simple harmonic oscillator of mass m, with spring constant κ,
position x & momentum p. The total energy is:
E = p²/(2m) + ½κx²   (1)
E is determined by the initial conditions.
If the oscillator is isolated, E is conserved.
• How do we find the number of accessible states for this oscillator?
• Consider the (x,p) phase space. In that space, E = constant, so (1) is the
equation of an ellipse:
(Figure: the ellipse of constant E in the (x, p) phase plane.)
• If we knew the oscillator energy E exactly, the accessible states
would be the points on the ellipse. In practice, we never know the
energy exactly! There is always an experimental error δE.
δE ≡ Uncertainty in the energy. Always assume:
|δE| <<< |E|
• For the geometrical picture in the x-p plane, this means that the energy
is somewhere between 2 ellipses, one corresponding to E & the other
corresponding to E + δE.

# accessible states ≡ # phase space
cells between the 2 ellipses ≡ (A/h₀)
A ≡ area between ellipses, h₀ ≡ δq δp
• In general, there are many cells in the phase space area
between the ellipses (h₀ is "small"). So, there are
a LARGE NUMBER of accessible states
for the oscillator with energy between E & E + δE.
That is, there are many possible values of (x,p) for
a set of oscillators in an ensemble of such oscillators.
Fundamental Postulate of Statistical Mechanics:
⇒ All possible values of (x,p)
with energy between E & E + δE are equally likely.
• Stated another way, ANY CELL in phase space
between the ellipses is equally likely.
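A back-of-the-envelope Python sketch of this cell count (all numerical values are my own illustrative choices): the ellipse of energy E encloses area 2πE/ω, so Ω ≈ (area between the two ellipses)/h₀ ≈ 2πδE/(ω h₀).

```python
# Counting cells between the two ellipses for a classical oscillator
# (illustrative numbers assumed): the ellipse of energy E has area
# 2*pi*E/omega, so the shell between E and E + dE has area ~ 2*pi*dE/omega,
# and Omega ~ (shell area) / h0.
import math

hbar = 1.054571817e-34
m, kappa = 1e-3, 10.0            # kg, N/m  (assumed macroscopic oscillator)
omega = math.sqrt(kappa / m)     # rad/s
h0 = 0.5 * hbar                  # smallest sensible cell area

E = 1e-2                         # J
dE = 1e-8 * E                    # tiny experimental uncertainty, dE << E

def ellipse_area(En):
    """Area enclosed by the constant-energy ellipse in the (x, p) plane."""
    return 2 * math.pi * En / omega

Omega = (ellipse_area(E + dE) - ellipse_area(E)) / h0
print(f"Omega ~ {Omega:.2e} accessible cells")   # an enormous number (~1e23 here)
```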
Approach to Equilibrium
The Fundamental Postulate:
An ISOLATED system IN EQUILIBRIUM is
equally likely to be in any one of its accessible states
– the Fundamental Postulate of Statistical Mechanics.

• Suppose that we know that, in a certain situation,
a particular system is NOT equally likely to be
in any one of its accessible states.
– Is this a violation of the Fundamental Postulate?
NO!!
• But, in this situation we can use the Fundamental
Postulate to infer that either:
1. The system is NOT ISOLATED
Or
2. The system is NOT IN EQUILIBRIUM
In this course, we’ll spend a lot of time discussing item 1.:
That is, we’ll discuss systems which are not isolated.

Now, here, we'll very BRIEFLY discuss item 2.:
Systems which are not in equilibrium.
NON-EQUILIBRIUM Statistical Mechanics:
This is still a subject of research in the 21st Century.
It is sometimes called
Irreversible Statistical Mechanics

• If a system is not in equilibrium, we expect the
situation to be a time-dependent one.
• That is, the average values of various
macroscopic parameters will be time-dependent.
• Suppose, at time t = 0, an ISOLATED system is
known to be in only a subset of the states accessible to it.

• There are no restrictions which would then prevent
the system from being found in ANY ONE of its
accessible states at some time t > 0 later.
• Therefore, it is very improbable that the system
will remain in this subset of its accessible states.
What will happen?
• The system will change with time due to
interactions between the particles.
– It will make transitions between its various accessible states.
– After a long time, we would expect an ensemble of similar
systems to be uniformly distributed over the accessible states.
– Equilibrium will be reached if we wait “long enough”.
– After that time, the system will be equally likely to be found in
any one of its accessible states.
How long is “long enough”?
This depends on the system. It could be femtoseconds,
nanoseconds, centuries, or billions of years!
A principle of Non-Equilibrium Statistical Mechanics:
"All isolated systems will, after a
'sufficient time', approach equilibrium"
≡ "Boltzmann's H-Theorem"

Non-Equilibrium Statistical Mechanics


Example 1
The 3 spin system in an external magnetic field again.
• Suppose that we know that the total energy is E = -μH.
• Suppose that we prepare the system so it is in the state
(+,+,-)
Recall that:
This is only 1 of the 3 accessible states consistent with this energy.
Now allow some "small" interactions between the spins. These can "flip" them.
⇒ We expect that, after a long enough time, an
ensemble of similar systems will be found with
equal probability in any one of its 3 accessible states:
(+,+,-), (+,-,+), (-,+,+)
Example 2: A gas in a container. The container is divided
by a partition into 2 equal volumes V. It is in equilibrium &
confined by the partition to the left side. See the Figure.

(Figure: a partitioned box; gas fills the left half, vacuum on the right.)

Now, remove the partition.
After this, the new situation is clearly NOT an equilibrium one.
The accessible states with molecules in the right side are NOT yet occupied.
Now, wait some time. As a result of collisions between the molecules,
they eventually will distribute themselves uniformly over the
entire volume of 2V. This will be the final equilibrium situation.
Section 2.4: Probability Calculations
• To calculate the probability of finding a system in a given state:
Use the Fundamental Postulate of Statistical Mechanics:
An isolated system in equilibrium is equally likely
to be found in any one of its accessible states.
• There will always be an uncertainty in our knowledge of the
system energy ≡ δE. Suppose that we know that the energy of the
system is in the range E to E + δE.
• Define:
Ω(E) ≡ The total number of accessible states in this range.
y ≡ A macroscopic system parameter
(pressure, magnetic moment, etc.).
Ω(E;yk) ≡ the number of these states for which y = yk
(a particular value of y)
• Let P(y = yk) ≡ the probability that y = yk. The
Fundamental Postulate of Statistical Mechanics gives:
⇒ P(y = yk) = [Ω(E;yk)/Ω(E)]

• What is the mean (expected or measured) value of y?
From probability theory, this is simply:
<y> = ∑k yk [Ω(E;yk)/Ω(E)]
• Clearly, to calculate this, we need to know both Ω(E)
& Ω(E;yk). This will be discussed in detail!
In principle (if we know the Ω’s) this calculation is easy.
• Example: 3 particles of spin ½ in an external magnetic field
again. Suppose that we know from measurement that the total
energy is E = - μH. As we’ve seen, there are only 3 accessible
states for E = - μH. These are: (+,+,-) (+,-,+) (-,+,+).
• Question: What is the probability of finding spin #1 in the “up”
position? In this case, Ω(E) = Ω(E = -μH) = 3 and
Ω(E;yk) = Ω(E = -μH; spin 1 is "up") = 2
• So, the probability that spin #1 is “up” is:
P(spin 1 “up”) ≡ (⅔)
• Also, the probability that spin #1 is “down” is:
P(spin 1 “down”) ≡ (⅓)
• So, we can find answers to questions like: What is the
Mean (average) magnetic moment of spin # 1?

<μz> = ∑k μzk P(μz = μzk)
(sum goes over k = "up" & "down")
<μz> = P(#1 "up")(μ) + P(#1 "down")(-μ)
= (⅔)(μ) + (⅓)(-μ)
⇒ <μz> = (⅓)μ
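The same calculation in a few lines of Python (my own check, in units with μ = H = 1), applying P(y = yk) = Ω(E; yk)/Ω(E) directly to the enumerated accessible states:

```python
# A worked check of the calculation above: count accessible states, then
# apply P(y = y_k) = Omega(E; y_k) / Omega(E).  Units with mu = H = 1 assumed.
from fractions import Fraction
from itertools import product

mu = 1
accessible = [s for s in product((+1, -1), repeat=3) if sum(s) == 1]  # E = -mu*H

Omega_E = len(accessible)                              # = 3
Omega_up = sum(1 for s in accessible if s[0] == +1)    # spin #1 "up": 2 of them
P_up = Fraction(Omega_up, Omega_E)
P_down = 1 - P_up
mean_mu_z = P_up * mu + P_down * (-mu)
print(P_up, P_down, mean_mu_z)     # -> 2/3  1/3  1/3   (i.e. <mu_z> = mu/3)
```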
Section 2.5: Behavior of the Density of States
• We’ve just seen that the probability that a system parameter y has
a value yk is:
P(y = yk) ≡ [Ω(E;yk)/Ω(E)]
Ω(E) ≡ number of accessible states with energy between E & E + δE.
⇒ To do probability calculations,
we need to know the E dependence of Ω(E).
• Goal of this discussion: APPROXIMATE the E dependence of Ω(E).
– Discussion will not be mathematically rigorous. More of a "handwave".
• Macroscopic System, f degrees of freedom, f is huge! f ~ 10^24.
• The energy is in the range E to E + δE. Of course, Ω(E) ∝ δE.
Extrapolating the discussion of the 1d oscillator,
Ω(E) ~ the volume of phase space in this energy range.
Define Ω(E) ≡ ω(E)δE. ω(E) ≡ Density of states
• Our Goal: ESTIMATE E dependence of Ω(E) (or ω(E)).
~ How many accessible states are there for a macroscopic
(f ~ 10^24) system at energy E?
– We aren’t interested in exact results.
– We want an order of magnitude estimate!!
– The result is an abstract, but very significant result.
• Let Φ(E) ≡ total # of quantum states for all energies E´ ≤ E.
• Consider 1 “typical” degree of freedom. ε ≡ energy associated with that
degree of freedom. Let φ(ε) ≡ total # of quantum states for this
degree of freedom.
• In general, φ(ε) increases with increasing ε. So, we can write:
φ(ε) ∝ ε^α, with α ~ 1; take
φ(ε) ∝ ε   (1)
• Now, for the system, replace the energy E by the "average
energy" per degree of freedom of a system with f degrees of freedom:
ε ≈ (E/f)   (2)
• f degrees of freedom, with ~φ(ε) states associated with each one.
⇒ The total # of states associated with f degrees of freedom
≈ the product of the # associated with each one:
Φ(E) ≈ [φ(ε)]^f   (3)
Now, use (1), (2), & (3) together:
⇒ The total # of states for all energies E′ ≤ E is roughly:
Φ(E) ≈ [φ(ε)]^f ≈ (E/f)^f ∝ E^f, i.e. Φ(E) ≈ A E^f   (4)
A = constant
Ω(E) ≡ # accessible states with energy between E & E + δE
So, write: Ω(E) ≡ Φ(E + δE) - Φ(E); δE <<< E
⇒ Now, expand Φ in a Taylor series & keep only the lowest-order term:
⇒ Ω(E) ≈ (∂Φ/∂E) δE ≈ [∂(AE^f)/∂E] δE ∝ E^(f-1) δE ≈ E^f δE   (since f >> 1)
⇒ Ω(E) ∝ E^f δE   (f ~ 10^24)
⇒ Ω(E) is a
RAPIDLY (!!)
increasing function of E!!!!!
• Briefly look at some numbers with powers of 10 in the exponent to get
a "feel" for the "bigness" of Ω(E) ∝ E^f:
1. The age of the Universe in seconds is "only" ~ 10^18 s!
1 with 18 zeros after it!
2. The Volume of the Universe divided by the volume of a grain of sand
is "only" ~ 10^90!
1 with 90 zeros after it!
3. If f ~ 10^24: By what factor does Ω(E) change when E changes by only 1%?
Let r ≡ [Ω(E + 0.01E)/Ω(E)] = (1.01)^f, f ~ 10^24
Evaluate r using logarithms: log10(r) = 10^24 log10(1.01) ≈ 4.32 × 10^21
So, r ≈ 10^x, where x ≈ 4.32 × 10^21,
r ~ 1 with 4.32 × 10^21 zeros after it!
Now, consider the same problem again, but let E increase by only 10^-6 E:
We get r ≈ 10^y, where y ≈ 4.34 × 10^17,
r ~ 1 with 4.34 × 10^17 zeros after it!
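The same arithmetic in Python (a check of the numbers quoted above), working entirely with log10(r) = f·log10(1 + x), since r itself is far too large to represent directly:

```python
# Reproducing the arithmetic above: how much does Omega(E) ~ E^f grow when
# E grows by a tiny fraction x?  Work with log10(r) = f * log10(1 + x).
import math

f = 1e24
for x in (0.01, 1e-6):                       # 1% and one-part-per-million increase
    log10_r = f * math.log10(1.0 + x)
    print(f"E -> E*(1 + {x}):  r ~ 10^({log10_r:.3g})")
# -> r ~ 10^(4.32e+21) for a 1% change, r ~ 10^(4.34e+17) for a 1e-6 change
```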
• Finally, consider 2 numbers:
C = 10^u, u = 10^24;   D = 10^v, v = 2 × 10^24

• Are these similar numbers?? NO!!!
The ratio is: (D/C) = 10^(v-u) = 10^u
⇒ That is, D is 10^u times (u = 10^24!) larger than C:
1 with 10^24 zeros after it!
• The bottom line is:
Ω(E) is an almost incomprehensibly
ENORMOUSLY RAPIDLY (!!)
increasing function of E!!!!!
Special Case: The Ideal, Monatomic Gas
• To make this clearer, consider a classical ideal, monatomic gas,
with N identical molecules confined to a volume V.
Calculate Ω(E) for this case.
Ideal ⇒ NO INTERACTION between molecules.
~ Valid for real gases in the low-density limit.
• In this simple case, the total energy E of the gas is the sum of the
kinetic energies of the N molecules, each of mass m:
E = (2m)^-1 ∑(i = 1,N) (pi)², pi = 3-d momentum of particle i.
Ω(E) ≡ # of accessible states in energy interval E to E + δE
Ω(E) ≡ # of cells in phase space between E & E + δE.
Ω(E) ∝ volume of phase space between E & E + δE.
Recall the 1-d oscillator, where Ω(E) ∝ the area between the 2 ellipses.
2mE = ∑(i = 1,N) ∑(α = x,y,z) (piα)²
The energy is independent of particle positions!
piα = α component of momentum of particle i.

Ω(E) ∝ ∫(E ≤ energy ≤ E + δE) d³r1 d³r2 … d³rN d³p1 d³p2 … d³pN
(a 6N-dimensional volume integral!)
• The limits E & E + δE are independent of the ri's
⇒ The position integrals for each ri can be done immediately:
∫d³ri = V ⇒ ∫d³r1 d³r2 … d³rN = V^N
⇒ Ω(E) ∝ V^N ∫(E ≤ energy ≤ E + δE) d³p1 d³p2 … d³pN   (1)
Consider the remaining sum ⇒ a 3N-dimensional volume
integral in p space, with
2mE = ∑(i = 1,N) ∑(α = x,y,z) (piα)²   (2)
(2) ≡ a "sphere" in 3N-dimensional momentum space.
• Briefly consider the case of 1 particle only. In that case, (2) is:
⇒ 2mE = (px)² + (py)² + (pz)²
This is a "sphere" in momentum space of "radius" R(E) = (2mE)^½
For 1 particle, the 3-d "sphere volume" ∝ [R(E)]³ ∝ E^(3/2)
⇒ For N particles, in 3N-dimensional momentum space,
(2) ≡ a "sphere" of "radius" R(E) = (2mE)^½
So, the 3N-dimensional "sphere volume" is
∝ [R(E)]^(3N) ∝ E^(3N/2)
• Let's write: Ω(E) ∝ V^N G(E)
where: G(E) ≡ ∫(E ≤ energy ≤ E + δE) d³p1 d³p2 … d³pN   (3)
G(E) ≡ volume of the "spherical shell" between E & E + δE
(this shell, between the "radii" R(E) & R(E + δE), is shown schematically
for 2 dimensions in the figure)
G(E) ∝ [R(E)]^(3N) ∝ E^(3N/2)
⇒ Ω(E) ∝ V^N E^(3N/2)
Write: Ω(E) = B V^N E^(3N/2) δE, B = constant
= # of accessible states of an ideal gas in the energy interval E to E + δE
• Now, in general, we found for f degrees of freedom:
Ω(E) = A E^f δE
A = constant
• For the ideal gas, f = 3N, so we just got E^(3N/2) = E^(f/2).
• But, again, all of this was approximate & order of magnitude.
Don't worry about the difference between the exponents f & (½)f.
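As a cross-check of the E^(3N/2) scaling (my own sketch, using the standard formula for the volume of an n-dimensional sphere, V_n(R) = π^(n/2) R^n / Γ(n/2 + 1), with arbitrary m and a small N): doubling E raises ln(volume) by exactly (3N/2)·ln 2.

```python
# A quick numerical check of the E^(3N/2) scaling: the volume of a sphere
# of radius R = sqrt(2mE) in n = 3N dimensions is
#   V_n(R) = pi^(n/2) R^n / Gamma(n/2 + 1),
# so the momentum-space volume grows as E^(3N/2).
import math

def log_sphere_volume(n, R):
    """ln of the n-dimensional sphere volume of radius R."""
    return (n / 2) * math.log(math.pi) + n * math.log(R) - math.lgamma(n / 2 + 1)

m, N = 1.0, 100                   # arbitrary mass and (small) particle number
for E in (1.0, 2.0):
    R = math.sqrt(2 * m * E)
    print(E, log_sphere_volume(3 * N, R))

# The difference of the two logs equals (3N/2)*ln(2), confirming V ~ E^(3N/2):
print(3 * N / 2 * math.log(2.0))
```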
