*[The following is a guest post from Bjoern Malte Schaefer. Bjoern is one of the curators of the Cosmology Question of the Week blog, which is worth checking out. This post is a historical look at some of the early parts in the history of quantum mechanics, in particular, the black-body spectrum. Questions are welcome and I'll make sure he sees any of them. Image captions (and hyper-links, in this case) are, as usual, by me, because guest posters don't ever seem to provide their own.]*

**Two unusual systems**

Quantum mechanics surprises with the statement that the microscopic world works very differently from the macroscopic world, and it took a while until quantum mechanics was formally established as the theory of the microworld. In particular, even though two of the natural systems on which theories of quantum mechanics could initially be tested were very simple, even from the point of view of the physicists of the time, a number of novel concepts had to be introduced for their description. These two physical systems were the hydrogen atom and the spectrum of a thermal radiation source. The hydrogen atom was the lightest of all atoms and had the most simply structured spectrum, exhibiting many regularities involving rational numbers relating its discrete energy levels. It could only be ionised once, implying that it has only a single electron, and for these reasons it was the obvious test case for any theory of mechanics in the quantum regime. Wolfgang Pauli, working with Heisenberg's newly developed matrix mechanics, was the first to solve this quantum mechanical analogue of the Kepler problem, i.e. the equation of motion of a charge moving in a Coulomb potential, paving the way for a systematic understanding of atomic spectra, their fine structure, the theory of chemical bonds, interactions of atoms with fields and ultimately quantum electrodynamics.
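The "regularities involving rational numbers" are captured by the Rydberg formula, in which the inverse wavelengths of the spectral lines are differences of reciprocal squares of integers. A few lines of Python (my addition, not from the original post) make this concrete:

```python
import math

# Rydberg formula for hydrogen: 1/lambda = R * (1/n1^2 - 1/n2^2) with n2 > n1.
# The rational numbers 1/n^2 are the regularities mentioned in the text.
R = 1.0973731568e7  # Rydberg constant in 1/m

def wavelength_nm(n1, n2):
    """Wavelength of the photon emitted in the n2 -> n1 transition, in nm."""
    inv_lambda = R * (1.0 / n1**2 - 1.0 / n2**2)
    return 1e9 / inv_lambda  # convert metres to nanometres

# The first Balmer line (n = 3 -> 2) is the famous red H-alpha line:
print(round(wavelength_nm(2, 3), 1))  # ~656.1 nm
```

The entire visible Balmer series follows from the same formula by varying the upper level, which is exactly the kind of pattern that cried out for a mechanical explanation.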

The Planck spectrum was equally puzzling: It is the distribution of photon energies emitted from a body in thermal equilibrium, and it does not require any further specification of the body apart from the condition that it be black, meaning that it ideally emits and absorbs radiation irrespective of wavelength. From this point of view it is really the simplest macroscopic body one could imagine, because its internal structure does not matter. In contrast to the hydrogen atom, it is described by a continuous spectrum. In fact, there are at least two beautiful examples of Planck spectra in Nature: the thermal spectrum of the Sun and the cosmic microwave background. The solution to the Planck spectrum involves quantum mechanics, quantum statistics and relativity, and unites three of the great constants of Nature: the Planck quantum h, the Boltzmann constant \(k_B\) and the speed of light c.

**Limits of the Planck spectrum**

Although criticised at the time by many physicists as phenomenological, the high-energy part of the Planck spectrum is relatively straightforward to understand, as was realised by Wilhelm Wien. Start with the result that photons, as relativistic particles, carry energies proportional to their frequency and momenta inversely proportional to their wavelength (the constant of proportionality in both cases being the Planck constant h). Imposing isotropy of the photon momenta and assuming a thermal distribution of energies according to Boltzmann then leads directly to Wien's result, which is an excellent fit at high photon energies but shows discrepancies at low photon energies, implying that at low energies the system exhibits quantum behaviour of some type.
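In the dimensionless variable x = hν/k\_B T, Wien's Boltzmann-statistics result is x³e⁻ˣ, while the full Planck shape is x³/(eˣ − 1). A quick comparison (Python, my sketch) shows where the two agree and where Wien fails:

```python
import math

# Dimensionless spectral shapes, x = h*nu / (k_B * T), normalisation dropped.
def planck(x):
    """Full Planck spectral shape."""
    return x**3 / math.expm1(x)

def wien(x):
    """Wien's approximation: classical Boltzmann statistics for the photons."""
    return x**3 * math.exp(-x)

# At high photon energies the two agree to better than a hundredth of a percent:
print(wien(10) / planck(10))    # ~0.99995
# At low photon energies Wien underestimates the spectrum by a factor of ten:
print(wien(0.1) / planck(0.1))  # ~0.095
```

The ratio works out to 1 − e⁻ˣ, so the discrepancy is confined to x ≲ 1: exactly the "low photon energy" regime where something quantum mechanical had to be going on.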

Wien had a second result: It was known experimentally that the location of the maximum of the spectrum, in terms of photon energy, is proportional to the temperature of the radiating body. Intuitively this makes a lot of sense (hotter bodies emit more energetic radiation), and it followed from Wien's calculation, but Wien could not quite make sense of the numerical prefactor, which of course differs because of the unknown functional shape of the spectrum at low energies. And in fact Wien had a third result: The total energy radiated depends on the temperature taken to the fourth power. This result follows from Wien's formula for the spectrum as well, and one could guess the natural constants involved by dimensional analysis, but the prefactor is again off by a little, underlining that something fundamental is amiss at low energies.

Wilhelm Wien looking like a pretty typical turn-of-the-20th-century man posing for a photo.
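The displacement-law prefactor that puzzled Wien can be computed once the full spectrum is known: maximising x³/(eˣ − 1) reduces to the transcendental equation x = 3(1 − e⁻ˣ), which a short fixed-point iteration (Python, my sketch) solves:

```python
import math

# Peak of the Planck spectrum: d/dx [x^3 / (e^x - 1)] = 0 rearranges to
# x = 3 * (1 - exp(-x)). Fixed-point iteration converges quickly from x = 3.
x = 3.0
for _ in range(50):
    x = 3.0 * (1.0 - math.exp(-x))

print(round(x, 3))  # ~2.821: the peak sits at photon energy 2.821 * k_B * T
```

Wien's approximate spectrum x³e⁻ˣ instead peaks exactly at x = 3, so his formula predicted the right proportionality to T but a prefactor that was "off by a little", just as described above.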

**Statistics at low energies**

While the solution to the hydrogen atom lies in the correct equation of motion for the electron, with all the formal overhead needed in quantum mechanics, the solution to the Planck spectrum needed the insight that quantum mechanical particles require a different type of statistics, one which asymptotically recovers classical statistics at high energies. Clearly, in the case of non-interacting photons (electrodynamics is a linear theory!), an equation of motion cannot provide the solution.

A textbook would now state that photons, as quantum mechanical particles, are identical and indistinguishable, which was very confusing to me when I first read it, but the two adjectives refer to very specific properties of photons. Surely they are identical in the sense that they are excitations of the electromagnetic field: They share physical properties such as polarisation, energy and momentum, and they always travel at the same speed. Indistinguishable means something different: You can *not* follow the trajectory of a quantum mechanical particle in the same way you could with a classical particle. All one can do is localise particles at a given instant and localise them again at a later time, but there is no way of telling which particle from the first localisation has moved to which position at the second localisation, and in fact there is interference between both paths.

This effect is relevant if the typical particle separation is small compared to a length scale set by quantum mechanics: the wavelength of a photon whose energy corresponds to the thermal energy. Highly energetic photons are rare and separated by large distances, so quantum mechanical interference does not play a role and the photons behave classically, in contrast to low-energy photons, of which there are plenty, and which are separated by small distances. For these tightly packed photons quantum interference matters and indistinguishability becomes important.
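This criterion can be checked for a real Planck spectrum. Taking the cosmic microwave background as the example (the numbers below are standard values, the comparison itself is my sketch), the typical photon separation comes out smaller than the thermal wavelength:

```python
import math

# Physical constants (SI)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
kB = 1.380649e-23    # Boltzmann constant, J/K

def photon_density(T):
    """Blackbody photon number density, n = (2*zeta(3)/pi^2) * (kB*T/(hbar*c))^3."""
    zeta3 = sum(1.0 / k**3 for k in range(1, 1000))  # zeta(3) ~ 1.20206
    hbar = h / (2.0 * math.pi)
    return (2.0 * zeta3 / math.pi**2) * (kB * T / (hbar * c))**3

T = 2.725  # CMB temperature in K
n = photon_density(T)                  # ~4.1e8 photons per cubic metre
separation = n**(-1.0 / 3.0)           # typical photon separation, ~1.3 mm
thermal_wavelength = h * c / (kB * T)  # wavelength at the thermal energy, ~5.3 mm

# Separation < thermal wavelength: the CMB photons overlap quantum mechanically.
print(separation < thermal_wavelength)  # True
```

So even the coldest Planck spectrum we know of sits squarely in the regime where indistinguishability matters.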

Constructing statistics for indistinguishable particles is then a bit like drawing from an urn with replacement, while considering all draws of a certain number of photons as equivalent. This alteration of statistics is relevant at energies small compared to the thermal energy of the system, while at high energies the system behaves purely classically, giving rise to Wien's results.
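The "urn with replacement" counting leads to the Bose–Einstein occupation number 1/(eˣ − 1) per mode, as opposed to the classical Boltzmann factor e⁻ˣ. Comparing the two (Python, my sketch) shows exactly the asymptotic behaviour described above:

```python
import math

# Mean occupation number per mode at energy x = E / (k_B * T):
def bose_einstein(x):
    """Counting with 'replacement': Bose-Einstein occupancy."""
    return 1.0 / math.expm1(x)

def boltzmann(x):
    """Classical, distinguishable-particle occupancy."""
    return math.exp(-x)

# High energies: the two agree, recovering Wien's classical regime.
print(bose_einstein(5) / boltzmann(5))        # ~1.007
# Low energies: modes are heavily over-occupied relative to the classical guess.
print(bose_einstein(0.05) / boltzmann(0.05))  # ~20.5
```

The ratio is 1/(1 − e⁻ˣ), so the quantum enhancement diverges as x → 0: this over-occupation of the low-energy modes is precisely what lifts the spectrum above Wien's formula there.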

Additionally, the new statistics explains the puzzling numbers that Wien could not make sense of: They appear as values of the Riemann zeta function (a very enigmatic function with fascinating properties) and, in contrast to the rational numbers occurring in the hydrogen problem, they are irrational.
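Concretely, the moments of the Planck spectrum evaluate to ∫₀^∞ x^(s−1)/(eˣ − 1) dx = Γ(s)ζ(s): s = 4 fixes the Stefan–Boltzmann prefactor (π⁴/15), and s = 3 fixes the total photon number (2ζ(3), with ζ(3) irrational). A crude numerical check (Python, my sketch):

```python
import math

def bose_integral(s, n=200000, xmax=50.0):
    """Trapezoidal estimate of the integral of x^(s-1)/(e^x - 1) from 0 to infinity.
    The integrand vanishes at both ends, so endpoint terms are negligible."""
    h = xmax / n
    total = 0.0
    for i in range(1, n):
        x = i * h
        total += x**(s - 1) / math.expm1(x)
    return total * h

zeta4 = math.pi**4 / 90.0
print(round(bose_integral(4), 4))       # ~6.4939, matching...
print(round(math.gamma(4) * zeta4, 4))  # ...Gamma(4)*zeta(4) = pi^4/15 = 6.4939
```

These zeta values are the irrational prefactors in Wien's displacement and T⁴ laws that no classical argument could produce.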

**Pauli's principle and the Hanbury Brown–Twiss experiment**

In what way exactly do photons interfere, then? Statistically, photons prefer to be "bunched", which is a consequence of a new symmetry noticed by Wolfgang Pauli and *not* of their dynamics (after all, they are still non-interacting, as electrodynamics is linear). Quantum mechanical systems react in a specific way if one exchanges the particles that make up the system, which is in fact very relevant between the successive localisation steps discussed before. Nature is quite capricious when it comes to this point, as Pauli noticed: there are only two types of particles. The first family, called bosons, exhibits constructive interference between realisations with interchanged particles. Photons belong to this family, and their bunching is explained by the fact that, due to constructive interference, it is overall more likely to find them in identical states. (The second family is called fermions, which show destructive interference between realisations with interchanged particles; one example of this group is the electron.) It is worth noting that Max Planck himself solved the problem with purely thermodynamical stability arguments, without providing the statistical description, which is due to Satyendra Nath Bose and Albert Einstein.

The bunching of photons has in fact been observed in an ingenious experiment by Robert Hanbury Brown and Richard Q. Twiss, who showed that after observing a photon from a thermal source it is statistically more likely to observe a second photon with similar properties. This not only means that there can be arbitrarily many photons in a single statistical state, but also that photons like being in the same state due to constructive interference when interchanged with their partners (and not because of interaction!).
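The statistical signature can be illustrated with a toy Monte Carlo (Python, my construction, not the actual experimental analysis): for thermal light the photon number per mode follows a geometric (Bose–Einstein) distribution and the correlation g²(0) = ⟨n(n−1)⟩/⟨n⟩² comes out near 2 (bunching), while for coherent, Poisson-distributed light it is 1.

```python
import math
import random

random.seed(1)

def g2(samples):
    """Second-order correlation g2(0) = <n(n-1)> / <n>^2 from photon counts."""
    mean_n = sum(samples) / len(samples)
    mean_nn = sum(n * (n - 1) for n in samples) / len(samples)
    return mean_nn / mean_n**2

N = 200000
nbar = 1.0  # mean photon number per mode

# Thermal light: geometric distribution P(n) = p * (1-p)^n, sampled by inversion.
p = 1.0 / (1.0 + nbar)
thermal = [int(math.log(1.0 - random.random()) / math.log(1.0 - p))
           for _ in range(N)]

def poisson(lam):
    """Knuth's Poisson sampler (fine for small lam)."""
    L = math.exp(-lam)
    k, prod = 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

# Coherent (laser-like) light: Poisson-distributed photon counts.
coherent = [poisson(nbar) for _ in range(N)]

print(round(g2(thermal), 2))   # close to 2: thermal photons arrive bunched
print(round(g2(coherent), 2))  # close to 1: no bunching
```

The factor of two is exactly the enhanced probability of catching a second photon right after the first that Hanbury Brown and Twiss measured.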

**Summary**

**The Planck spectrum and the hydrogen atom were central to the formulation of quantum mechanics. The solution to the Planck spectrum involved a new type of statistics, required by the indistinguishability of quantum mechanical particles and Pauli's exchange symmetry. And it is a nice example that quantum mechanics can become relevant in unusual places: about 40% of the energy of the Sun is carried by photons from the quantum mechanical part of the spectrum, which is a nice thought on a bright sunny day!**
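One way to read the 40% figure (my reconstruction, not necessarily the author's exact cut-off): take the "quantum mechanical part" to mean photon energies below roughly 3 k\_B T, where Wien's classical formula has visibly failed, and integrate the Planck spectrum up to that point:

```python
import math

def planck_energy_below(xcut, n=100000):
    """Trapezoidal estimate of the integral of x^3/(e^x - 1) from 0 to xcut."""
    h = xcut / n
    return sum((i * h)**3 / math.expm1(i * h) for i in range(1, n)) * h

total = math.pi**4 / 15.0  # full integral of x^3/(e^x - 1) from 0 to infinity
frac = planck_energy_below(3.0) / total
print(round(frac, 2))  # ~0.39: about 40% of the radiated energy
```

With this cut-off, roughly 39% of the Sun's radiated energy indeed comes from the part of the spectrum where Bose–Einstein statistics, rather than classical counting, is doing the work.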
