# Bosons and Fermions

As you saw in class, particles can be divided into two kinds: bosons and fermions. Bosons are particles that can occupy the same quantum state, and fermions are those that cannot occupy the same state. This distinction changes the way we compute the many-particle partition function in terms of the single-particle partition function, which we'll now work out in a toy example.

Recall the standard Boltzmann form of the many-particle partition function (canonical, not grand), $Z_N = \frac{Z_1^N}{N!}\,.$ The factor of $$N!$$ in the denominator corrects for the overcounting of states when the particles are indistinguishable: it divides out the $$N!$$ permutations of the particles among their single-particle states.

However, this factor is just an approximation to the true answer when we consider bosons and fermions. Consider a system with 10 possible states and 2 particles. The particles can either be in the same state or two different states.

• How many states are there for two distinguishable particles?
• How many are left if they are "Boltzons", classical particles obeying the (approximate) Boltzmann rule for counting?
• How many states are accessible to two bosons?
• How many states are accessible to two fermions?

You should conclude from the above calculations that the Boltzmann many-particle partition function interpolates between the boson and fermion many-particle partition functions.

• If each allowed microstate is occupied with equal probability, what is the probability that the two bosons will occupy the same state?
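As a sanity check on the counting above, here is a brute-force enumeration. It is written in Python rather than the Matlab used elsewhere in these notes, and it is my own illustration, not part of the original problem set:

```python
from itertools import product, combinations, combinations_with_replacement

S = 10  # number of single-particle states in the example above

distinguishable = len(list(product(range(S), repeat=2)))        # ordered pairs
bosons = len(list(combinations_with_replacement(range(S), 2)))  # unordered, repeats allowed
fermions = len(list(combinations(range(S), 2)))                 # unordered, no repeats
boltzons = distinguishable / 2                                  # divide by N! = 2!

print(distinguishable, boltzons, bosons, fermions)  # 100 50.0 55 45

# probability that the two bosons share a state, all microstates equally likely
p_same = S / bosons
print(p_same)  # 10/55 = 2/11, about 0.18
```

Note that the Boltzmann count of 50 sits between the fermion count of 45 and the boson count of 55, which is the interpolation claimed above.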

Let's try a slightly more sophisticated version of this counting. Consider a harmonic oscillator potential with evenly spaced, nondegenerate energy levels, and consider that there are five particles present in this system. For each of the three cases of distinguishable, boson, or fermion particles, we can ask the following questions:

• What does the ground state look like? That is, which states are occupied by which particles when the system is in its lowest energy configuration?

For distinguishable particles or bosons, the ground state is occupied as follows: $5\ 0\ 0\ 0\ 0\ 0\ 0\,,$ while for fermions, each particle must occupy a distinct energy level, so the occupation numbers look like $1\ 1\ 1\ 1\ 1\ 0\ 0\,,$ where I've drawn the energy levels from left to right in increasing order.

• Suppose that the system has one unit of energy. Describe the structure of the first excited state(s) for each of the three cases.

For bosons or distinguishable particles, we simply promote one of the particles to the first excited state, so the occupation numbers are given graphically by $4\ 1\ 0\ 0\ 0\ 0\ 0\,$ while for fermions they are given by $1\ 1\ 1\ 1\ 0\ 1\ 0\,.$ Importantly, there are five degenerate states for the case of distinguishable particles. This is because we could have chosen any one of the particles to promote to the first excited state. For bosons, there is only one first excited state because it is meaningless to talk about which boson got promoted. For fermions, there is also only one way that we could have obtained that occupancy configuration because of indistinguishability of the particles.

• What are the occupancies and degeneracies for the case of two units of energy? For three units?

For all three types of particles, there are two different occupancy numbers that give two units of energy. For distinguishable particles and bosons, these look like $4\ 0\ 1\ 0\ 0\ 0\ 0\,$ $3\ 2\ 0\ 0\ 0\ 0\ 0\,.$ For fermions, they look like $1\ 1\ 1\ 0\ 1\ 1\ 0\,$ $1\ 1\ 1\ 1\ 0\ 0\ 1\,.$ Importantly, we can compute the degeneracy of each of these configurations again. For distinguishable particles, there are $${5 \choose 1}$$ ways to get the first configuration and $${5 \choose 2}$$ ways to get the second, for a total of 15. For bosons and fermions, again, there is only one way to get each of their respective configurations. I'll leave three units of energy as an exercise.
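For distinguishable particles, these degeneracies are just multinomial coefficients, $N!/\prod_i n_i!$. A small Python helper (my own illustration, not from the notes) reproduces the counts of 5 and 10 found above:

```python
from math import factorial

def degeneracy(occupation):
    """Ways to assign distinguishable particles to a given list of
    occupation numbers: the multinomial coefficient N!/(n0! n1! ...)."""
    N = sum(occupation)
    d = factorial(N)
    for n in occupation:
        d //= factorial(n)
    return d

print(degeneracy([4, 0, 1]))  # 5, i.e. C(5,1)
print(degeneracy([3, 2]))     # 10, i.e. C(5,2)
```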

• How will the behavior of a bosonic system differ from that of a system of distinguishable particles at a small (low but nonzero) temperature?

## Quantum statistics and quantum volume

When the number of single-particle states is much larger than the total number of particles, then the distinction between bosons, fermions, or "boltzons" (classical indistinguishable particles) becomes irrelevant; it is simply too unlikely that a significant fraction of particles try to overlap in the same state in this limit, so the distinction is moot. This will be the case whenever $$Z_1 \gg N$$, i.e. when the single-particle partition function is large compared to the number of particles.

For an ideal gas, the single particle partition function depends on the particle density and the internal partition function like $Z_1 = \frac{V Z_{\textrm{int}}}{v_Q}\,,$ where you'll recall that the quantum volume is given by $v_Q = \biggl(\frac{h^2}{2\pi mkT}\biggr)^{3/2}\,.$ Therefore, the transition between classical statistics and quantum statistics takes place when the density of particles is high enough that their quantum volume becomes comparable to the actual physical volume occupied per particle, weighted by the number of internal states.

• Consider an ideal gas of Nitrogen molecules at standard pressure. To roughly what temperature do we need to cool this system (at constant pressure) before quantum effects become important?
h = 6.63e-34; % J s
k = 1.38e-23; % J/K
P = 1.013e5; % 1 atm in Pa
c = 1.66e-27; % kg per amu
mN = 2*14*c; % mass of N_2 in kg

vQ = @(T) (h^2 ./ (2*pi*mN*k*T)).^(3/2); % quantum volume
VonN = @(T) k*T/P; % volume per particle, V/N = kT/P, for an ideal gas

ratio = @(T) VonN(T)./vQ(T);
ezplot(ratio,[0,2]);


As we cool this system at constant pressure, eventually the density gets so large that the physical volume per particle is comparable to the quantum volume, and that is when we expect Boltzmann statistics to break down and quantum statistics to become important. From the graph, our rough estimate of this transition point is about 0.6 K. Note that this rough estimate neglects the contributions from the internal partition function.
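Instead of reading the crossover off the graph, we can also solve for it in closed form by setting $$V/N = v_Q$$. The sketch below does this in Python (my own cross-check, not part of the original Matlab script), using the same constants:

```python
from math import pi

h = 6.63e-34       # J s
k = 1.38e-23       # J/K
P = 1.013e5        # Pa (1 atm)
m = 28 * 1.66e-27  # mass of N2 in kg

# setting V/N = kT/P equal to v_Q = (h^2/(2 pi m k T))^(3/2) and solving for T:
# T = [ (P/k) * (h^2/(2 pi m k))^(3/2) ]^(2/5)
T_c = ((P / k) * (h**2 / (2 * pi * m * k))**1.5)**0.4
print(T_c)  # about 0.59 K, consistent with the graphical estimate
```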

• What is the crossover temperature for an ideal gas of helium atoms at standard pressure?

Last class, you saw how the spectrum of electromagnetic radiation was distributed in thermal equilibrium. In particular, you considered the EM radiation inside a box with side length $$L$$ and looked at the distribution of frequencies when at thermal equilibrium at a temperature $$T$$. By assuming that the EM energy was quantized into units of $$hf$$, you found that the partition function was $Z = \frac{1}{1-\e^{-\beta h f}}\,.$ This implies that the average number of energy units in a mode is $\bar{n}_{\text{Pl}} = \avg{n} = \frac{\avg{E}}{hf} = \frac{1}{\e^{h f/kT}-1}\,,$ which is called the Planck distribution. This is a special case of the Bose-Einstein distribution with the chemical potential set to zero, which can be regarded as a manifestation of the fact that photons can be either created or destroyed in any number whatsoever.
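Both formulas follow from the geometric series over photon numbers, and we can check them numerically. The snippet below is a Python sanity check of my own (the value of $$\beta h f$$ is an arbitrary choice for illustration):

```python
from math import exp

x = 0.5  # beta*h*f; an arbitrary illustrative value, not from the notes

# Z as an explicit sum over the number of energy units n = 0, 1, 2, ...
Z_sum = sum(exp(-n * x) for n in range(500))
# closed geometric-series form from the text
Z_closed = 1 / (1 - exp(-x))

# mean occupation computed two ways: from the sum, and from the Planck formula
n_bar_sum = sum(n * exp(-n * x) for n in range(500)) / Z_sum
n_bar = 1 / (exp(x) - 1)

print(Z_sum, Z_closed)   # agree to machine precision
print(n_bar_sum, n_bar)  # agree to machine precision
```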

We can also ask about the total number of photons and the total energy inside the box. To do this, we need to count all of the modes and weigh them accordingly. As a warmup, consider the case of 1D. Here the wavelength of mode m inside the box is given by $\lambda_m = \frac{2L}{m}\,,$ and the energy and momentum relations are simply $p_m = \frac{hm}{2L} \quad , \quad \eps = hf = \frac{hcm}{2L}\,.$ In 3D, we use the generalization of this rule, and we find that the allowed energies are $\eps = \frac{hc}{2L}\sqrt{m_x^2+m_y^2+m_z^2} \,.$ We can simplify this expression to make it look more like the previous one if we adopt the shorthand notation $\eps = \frac{hcm}{2L}\quad , \quad m = |\vec{m}|\,.$

Now the total energy is given by a sum over all modes, namely $U = 2 \sum_{m_x,m_y,m_z=1}^\infty \eps \bar{n}_{\text{Pl}}(\eps) = 2 \sum_m \frac{hcm}{2L} \frac{1}{\e^{hcm/2LkT} -1}\,.$ Here the factor of 2 comes from the sum over both polarization degrees of freedom per mode. As you saw with the density of states, we can turn this sum into an integral as follows, $U = \int_0^{\infty}\dd m \int_0^{\pi/2} \dd\theta \int_0^{\pi/2} \dd\phi(m^2 \sin \theta) \frac{hcm}{L}\frac{1}{\e^{hcm/2LkT} -1}$ The integral runs only over the positive orthant of the sphere. If we change variables, we can do the simple angular integrals and rewrite the remaining radial integral in terms of the energy as follows $U = L^3 \int_0^\infty \dd\eps \frac{8\pi\eps^3/(hc)^3}{\e^{\eps/kT}-1}\,.$ Dividing by the volume on each side, we can rewrite this as $\frac{U}{V} = \int_0^\infty u(\eps) \dd\eps \,,$ where now the integrand is the energy density per unit photon energy. That is, $$u$$ is the spectrum of the photons. It is worth writing it again separately: $u(\eps) =\frac{8\pi}{(hc)^3}\frac{\eps^3}{\e^{\eps/kT}-1}\,.$ By changing variables, we can alternatively think of the spectrum as being a function either of the energy, the frequency, or the wavelength accordingly.

As usual, we can introduce a dimensionless variable $$x$$ to help us study this quantity, $x = \frac{\eps}{kT}\,,$ and now the energy density becomes $\frac{U}{V} = \frac{8\pi(kT)^4}{(hc)^3} \int_0^\infty \dd x\frac{x^3}{\e^{x}-1} = \frac{8\pi^5(kT)^4}{15(hc)^3}\,.$
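The dimensionless integral evaluates to $$\pi^4/15$$, and we can confirm this numerically. Here is a quick midpoint-rule check in Python (my own illustration; the cutoff at $$x = 50$$ is an assumption justified by the exponential decay of the integrand):

```python
from math import pi, exp

# midpoint rule on (0, 50]; the integrand x^3/(e^x - 1) is negligible beyond x ~ 50
N = 200000
dx = 50.0 / N
total = sum(((i + 0.5) * dx)**3 / (exp((i + 0.5) * dx) - 1) * dx for i in range(N))

exact = pi**4 / 15
print(total, exact)  # both about 6.4939
```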

The final answer is in the last expression; the evaluation of the integral is not obvious, but it is shown in Appendix B of the textbook. More important are two features of this expression. First, the integrand is still proportional to the Planck distribution, so we can interpret it as a probability density: the fraction of the total energy carried by photons with dimensionless energy between $$x$$ and $$x+\dd x$$ is proportional to the integrand. Second, the energy density is proportional to the fourth power of $$kT$$.

The high-frequency (short-wavelength) behavior of the spectrum is accurately captured by Wien's law, which predated Planck's derivation of the correct blackbody spectrum. It amounts to dropping the $$-1$$ in the denominator of the Planck spectrum, which is an excellent approximation when $$hf \gg kT$$.
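Since the ratio of the two spectra is $(\e^x-1)\e^{-x} = 1-\e^{-x}$, the relative error of Wien's law is exactly $$\e^{-x}$$, tiny at high frequency and large at low frequency. A quick Python check of this claim (my own illustration, with arbitrarily chosen sample points):

```python
from math import exp

def planck(x):
    return x**3 / (exp(x) - 1)

def wien(x):
    return x**3 * exp(-x)  # Planck spectrum with the -1 dropped

# relative error of Wien's approximation; analytically it is exactly e^{-x}
errs = {x: abs(wien(x) - planck(x)) / planck(x) for x in (1.0, 3.0, 8.0)}
print(errs)  # ~0.37 at x=1, ~0.05 at x=3, ~3e-4 at x=8
```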

Let's make a plot of the spectrum as a function of photon frequency (or energy), save for the overall factor of $$8\pi/(hc)^3$$ which just serves to rescale the vertical axis.

u = @(x) x.^3./(exp(x)-1);
ezplot(u,[0,12]);


The peak value of this function occurs at $$x = 2.82$$, or equivalently $$hf = 2.82 kT$$. Thus, if you measure the relative intensity of radiation as a function of frequency for a blackbody and find the maximum, this will tell you the temperature.
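Where does 2.82 come from? Setting the derivative of $$x^3/(\e^x-1)$$ to zero gives the transcendental equation $3(1-\e^{-x}) = x$, which we can solve by bisection. A minimal Python sketch (my own, not part of the notes):

```python
from math import exp

# setting the derivative of x^3/(e^x - 1) to zero gives 3(1 - e^{-x}) = x
f = lambda x: 3 * (1 - exp(-x)) - x

lo, hi = 2.0, 4.0  # f(2) > 0 and f(4) < 0, so the root is bracketed
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

x_peak = (lo + hi) / 2
print(x_peak)  # about 2.8214
```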

If we add the dimensions back into the plot, then we can see the behavior of the blackbody spectrum as we change the temperature.

• Try to reproduce in Matlab the following plot of the Planck spectrum for various values of the temperature. Here they have divided out an overall constant factor, so don't worry about matching the vertical axis.
• Light sources are sometimes sold according to their "temperature", which does not correspond to the physical temperature of the source, but rather the temperature of the equivalent radiating blackbody. Which color light would be more appropriate for the lighting in a romantic Italian restaurant, 3000 K bulbs or 6000 K bulbs?

The sun is, to a good approximation, a blackbody with a surface temperature of about 5800 K.

• At the sun's surface, how much energy is in the EM radiation within a cubic meter?
• What fraction of this energy is in the visible portion of the spectrum, with wavelengths between 400 nm and 700 nm?
C = 4.6356*1e-18; % k^4/(hc)^3 in units: J/(m^3 K^4)
T = 5800; % Kelvin
U = 8*pi^5*C*T^4/15 % energy density in J/m^3

U =

0.8562
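The same number can be cross-checked directly from the closed-form result $U/V = 8\pi^5(kT)^4/15(hc)^3$. Here is a Python version (my own cross-check, using slightly more precise constants than the Matlab script above):

```python
from math import pi

# slightly more precise constants than the Matlab script above
h = 6.626e-34  # J s
c = 2.998e8    # m/s
k = 1.381e-23  # J/K
T = 5800       # K, surface temperature of the sun

a = 8 * pi**5 * k**4 / (15 * (h * c)**3)  # ~7.57e-16 J/(m^3 K^4)
U = a * T**4
print(U)  # about 0.86 J/m^3, matching the Matlab output
```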