In part I of this series, columnist David Ball laid the groundwork for why the scientific understanding of nature in the late 19th century was found wanting: it could not explain a variety of phenomena that scientists were examining. (One of these phenomena was spectroscopy itself!) In this installment, he reviews the paradigm shifts in science that preceded the development of the currently accepted theories of the nature of matter. It all starts with the nature of light.
Blackbody radiation, spectroscopy (especially of hydrogen), the photoelectric effect, low-temperature heat capacities: these phenomena puzzled the scientists of the 19th century. Well-entrenched theories of nature, including Newton's laws of motion and Maxwell's laws of electrodynamics, were remarkably successful in describing the behavior of matter and radiation, yet they could not explain these phenomena completely.
David W. Ball
As I've said before, if there is a disagreement between theory and nature, we've got to change either nature or theory. Because every attempt to change the universe has failed, our only choice is to get a new theory. That's what eventually happened, and it started with a thermodynamicist.
Max Planck (1858–1947) was a German theoretical physicist trained in thermodynamics. This was fortunate, for when Planck took up the problem of blackbody radiation, he realized that Wien's law (1) would apply only if the entropy of the light depended upon its energy. He further realized that if this were true in the high-frequency portion of the electromagnetic spectrum (where Wien's law applied), it had to be true in the low-frequency portion of the spectrum, where the Rayleigh-Jeans law was approximately correct.
Figure 1: The energy of a vibrating atom is proportional to its vibrational frequency ν, not its amplitude A.
Thus, Planck sought to combine the two laws into one mathematical framework. He succeeded, deriving a formula that did predict the nature of blackbody radiation, but he recognized that the expression had to have some physical justification, so he looked into what physical basis was needed to support it. Apparently, he did not like the conclusions he arrived at: that entropy was a statistical concept, not a deterministic or absolute one. In addition, Planck had to assume that the vibrating atoms in the blackbody could not absorb energy continuously, but only in certain amounts that were proportional to their vibrational frequency ν, not their amplitude A (Figure 1):
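$$E \propto \nu$$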
To make this proportionality an equality, a proportionality constant is needed:
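$$E = h\nu$$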
where h is now known as Planck's constant.
The formula for the intensity of blackbody radiation that Planck derived (written here in one common form, as the energy density per unit frequency) was
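$$\rho(\nu, T) = \frac{8\pi h\nu^{3}}{c^{3}} \cdot \frac{1}{e^{h\nu/kT} - 1}$$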
where h is Planck's constant, c is the speed of light, k is Boltzmann's constant, and T is the absolute temperature. This equation is known by several names: Planck's law, the radiation distribution law, the law of blackbody radiation, and so forth. In terms of wavelength λ, Planck's law is
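$$\rho(\lambda, T) = \frac{8\pi hc}{\lambda^{5}} \cdot \frac{1}{e^{hc/\lambda kT} - 1}$$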
Figure 2 shows a plot of the intensity of blackbody radiation versus wavelength at several temperatures. Because the energy of the vibrating atoms is restricted to a specific quantity based upon their vibrational frequencies, we say that the energy is quantized. Hence, these concepts are known as quantum theory.
Figure 2: Intensity of blackbody radiation versus wavelength at several temperatures.
It is widely recognized that Planck himself was rather troubled by the concept of quantization of energy, despite the fact that it led to a mathematical formula that agreed with experimental measurements. Until then, there had been no need to assume that energy was restricted to certain values. The kinetic energy of a moving body, for example, could have any value, because the velocity of a moving body could have any value:
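$$KE = \frac{1}{2}mv^{2}$$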
Because the velocity v could vary smoothly from 0 to (technically) infinity, so too could the kinetic energy. Having been educated in classical science all his life (Planck was 42 when he announced quantum theory), Planck found it difficult to adjust to such a radically new way of thinking. Ultimately he was convinced, mostly because the evidence was so strong.
Progress was slow after Planck announced his quantum theory in October of 1900 (he later published it in Annalen der Physik [2]). Although it was recognized that the equation he derived worked, there was still no reason to believe that his assumption that energy was proportional to frequency had any justification in nature.
This changed in 1905, when Albert Einstein suggested in another paper in Annalen der Physik (3) that light itself had an energy that was directly proportional to its frequency:
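$$E = h\nu$$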
This was the break Planck's quantum theory needed: if quantized energy was the proper way to describe a natural phenomenon, then Planck's theory had some physical basis and wasn't just a mathematical trick.
How did Einstein support this contention? He used it to explain the photoelectric effect, a phenomenon discovered in the 19th century but heretofore unexplained (1). Recall that in the photoelectric effect, light shined on a metal surface in a vacuum causes electrons to be emitted. The number of electrons emitted is related to the intensity of the light, and the kinetic energy of the emitted electrons is related to the frequency of the light used; furthermore, light below a certain frequency causes no electrons to be emitted at all. Einstein argued that if the energy of light were assumed to be proportional to its frequency, the photoelectric effect could be explained.
Einstein assumed that a metal holds an electron with some characteristic amount of energy, which we now call the work function of the metal, represented as Φ. To escape from the metal, an electron must absorb at least this much energy from the light; any additional energy goes into the kinetic energy of the emitted electron. The equation that Einstein deduced was
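$$h\nu = \Phi + \frac{1}{2}m_{e}v^{2}$$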
where mₑ is the mass of the electron and v is the velocity of the ejected electron. (Don't let the equation above fool you: the letter on the left side is the Greek letter "nu," while the final variable on the right side is the Latin letter "vee." They look very similar.) Thus, a minimum frequency (energy) of light is needed to overcome the work function Φ, with any remaining energy going into kinetic energy. Careful measurements of the photoelectric effect demonstrated that this indeed was what occurred, providing some evidence in favor of the argument that the energy of light was quantized.
Essentially, the quantum theory of light suggests that a light wave is acting as a particle of energy; the name "photon" was eventually coined for a particle of light. This interpretation re-ignited the "particle versus wave" debate about the nature of light (4). Probably the best way to consider the situation is to recognize that the two descriptions are no longer mutually exclusive — as we shall find out for matter as well.
Science recognizes that the concept of quantization is so important to understanding nature that pre-1900 physics is termed Classical Physics and post-1900 physics is called Modern Physics.
The particulate nature of light received further support in 1923, when Arthur Compton discovered the Compton effect: a high-energy photon can lose energy by colliding with subatomic particles such as electrons. In the collision, the photon transfers momentum, another property of a particle, to the electron.
Earlier, in 1907, Einstein had demonstrated the importance of quantized energy by showing that it could explain the variation of the heat capacity of materials at temperatures close to absolute zero. Although the formula Einstein derived did not agree well with experimental data (Peter Debye derived a much more accurate model in 1912), it did demonstrate that quantum effects were useful in explaining natural phenomena.
In 1913, Danish physicist Niels Bohr assumed that a different quantity, angular momentum, might be quantized as well. With that assumption and some resulting algebra, Bohr was able to derive an equation that could explain the lines that appeared in the spectrum of the hydrogen atom — another one of those issues that Classical Science was unable to explain. The equation he derived was
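$$\frac{1}{\lambda} = \frac{m_{e}e^{4}}{8\varepsilon_{0}^{2}h^{3}c}\left(\frac{1}{n_{1}^{2}} - \frac{1}{n_{2}^{2}}\right)$$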
where all variables have been defined previously except for the charge on the electron e and the permittivity of free space ε₀. The variables n₁ and n₂, both of which are squared, are called quantum numbers and relate to the quantized angular momentum of the electron's orbits about the hydrogen nucleus. The variable λ on the left side is the wavelength of light found in the spectrum of the hydrogen atom.
While successful, Bohr's equation was limited to the hydrogen atom (and any other one-electron system, such as Li²⁺). Ultimately, a more successful model was needed, but Bohr did demonstrate that quantities other than energy could be thought of as quantized.
Finally, in 1923, French physicist Louis de Broglie suggested that matter acted as waves. By comparing Einstein's mass–energy equivalence with Planck's quantized energy equation, de Broglie deduced the expression
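$$\lambda = \frac{h}{mv}$$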
where λ is now the wavelength of a particle of matter. Notice that λ, called the de Broglie wavelength, is inversely proportional to mass, so for large (that is, macroscopic) masses the wavelength is negligible. For small masses like that of the electron, however, the wavelength is considerable: an electron moving at a velocity of 2.00 × 10⁶ m/s has a de Broglie wavelength of almost 4 Å, four times larger than a hydrogen atom! As such, the wave nature of matter cannot be ignored at the atomic level, and any proper description of electron behavior must take its wave nature into account. Experimental verification was not far off: in 1927 at Bell Labs, Clinton Davisson and Lester Germer diffracted a beam of electrons with a crystal of nickel. Because only waves can diffract, this demonstrated that electrons behave as waves.
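As a check of the wavelength quoted above, using h = 6.626 × 10⁻³⁴ J·s and mₑ = 9.109 × 10⁻³¹ kg:

$$\lambda = \frac{6.626 \times 10^{-34}\ \mathrm{J\,s}}{(9.109 \times 10^{-31}\ \mathrm{kg})(2.00 \times 10^{6}\ \mathrm{m/s})} \approx 3.6 \times 10^{-10}\ \mathrm{m} = 3.6\ \text{Å}$$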
Clearly, a new theory of the behavior of matter at the atomic level was needed. In the next installment, we will discuss this new theory and explore how it affects spectroscopy, which has been our main goal all along.
Both Planck's and Einstein's papers are available as scanned images in the original German at http://www.gallica.bnf.fr/scripts/catalog.php?IdentPerio=NP00025, which is an archive of Annalen der Physik from 1799 to 1936.
David W. Ball is a professor of chemistry at Cleveland State University in Ohio. Many of his "Baseline" columns have been reprinted in book form by SPIE Press as The Basics of Spectroscopy, available through the SPIE Web Bookstore at www.spie.org. His most recent book, Field Guide to Spectroscopy (published in May 2006), is available from SPIE Press. He can be reached at d.ball@csuohio.edu; his website is www.academic.csuohio.edu/ball.
(1) D.W. Ball, Spectroscopy 22(12), 91–94 (2007).
(2) M. Planck, Ann. der Phys. 4, 553–563 (1901).
(3) A. Einstein, Ann. der Phys. 17, 132–148 (1905).
(4) D.W. Ball, Spectroscopy 21(6), 30–33 (2006).