
Before the development of quantum theory, physicists assumed that, with perfect equipment under perfect conditions, any physical quantity could be measured as accurately as desired. The equations of quantum mechanics show that it is impossible to measure both the position and the momentum of a particle accurately at the same time. This rule is called Heisenberg's uncertainty principle, after the German physicist Werner Heisenberg, who derived it from other rules of quantum theory. The uncertainty principle means that as physicists measure a particle's position with more and more accuracy, the particle's momentum becomes less and less precise, or more and more uncertain, and vice versa.

Heisenberg stated his principle formally by describing the relationship between the uncertainty in the measurement of a particle's position and the uncertainty in the measurement of its momentum. He said that the uncertainty in position (represented by Δx) times the uncertainty in momentum (represented by Δp) must be greater than or equal to Planck's constant (h) divided by 4π (π is a constant approximately equal to 3.14). Mathematically, the uncertainty principle can be written as Δx Δp ≥ h/4π. This relationship means that as a scientist measures a particle's position more and more accurately, so that the uncertainty in its position becomes very small, the uncertainty in its momentum must become large to keep the inequality true. Likewise, if the uncertainty in momentum, Δp, becomes small, Δx must become large.

One way to understand the uncertainty principle is to consider the dual wave-particle nature of light and matter. Physicists can measure the position and momentum of an atom by bouncing light off the atom. If they treat the light as a wave, they must consider a property of waves called diffraction when measuring the atom's position.
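The trade-off expressed by the inequality can be sketched numerically. A minimal Python sketch follows; the function name and the example position uncertainty (roughly the size of an atom) are illustrative choices, not part of the original text:

```python
import math

H = 6.62607015e-34  # Planck's constant in joule-seconds

def min_momentum_uncertainty(delta_x):
    """Smallest momentum uncertainty (kg*m/s) allowed for a given
    position uncertainty delta_x (m), from delta_x * delta_p >= h / 4*pi."""
    return H / (4 * math.pi * delta_x)

# Example: position known to within about one atomic diameter (1e-10 m).
dp = min_momentum_uncertainty(1e-10)

# Halving the position uncertainty doubles the minimum momentum uncertainty,
# which is the compensation the text describes.
dp_tighter = min_momentum_uncertainty(0.5e-10)
```

The inverse proportionality is the whole content of the principle here: shrinking Δx by any factor grows the floor on Δp by the same factor.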
Diffraction occurs when waves encounter an object: instead of traveling in straight lines, the waves bend around it. If the wavelength is much shorter than the size of the object, the bending occurs only at the object's edges and is not a problem; most of the waves bounce back and give an accurate measurement of the object's position. If the wavelength is close to the size of the object, however, most of the waves diffract, making the measurement of the object's position fuzzy. Physicists must therefore bounce shorter and shorter waves off an atom to measure its position more accurately.

Using shorter wavelengths of light, however, increases the uncertainty in the measurement of the atom's momentum. Because of its particle nature (described in the Compton effect), light carries energy and momentum, and photons that strike the atom being measured change the atom's energy and momentum. The fact that measuring an object also affects the object is an important principle in quantum theory. Normally the effect is so small that it does not matter, but on the small scale of atoms it becomes important. The bump to the atom increases the uncertainty in the measurement of the atom's momentum, and light with more energy and momentum knocks the atom harder, creating more uncertainty.

The momentum of light is equal to Planck's constant divided by the light's wavelength, or p = h/λ. Physicists can increase the wavelength to decrease the light's momentum and measure the atom's momentum more accurately. Because of diffraction, however, increasing the light's wavelength increases the uncertainty in the measurement of the atom's position.

Physicists most often use the uncertainty principle that relates position and momentum, but a similar and important uncertainty relationship also exists between the measurement of energy and the measurement of time.
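The p = h/λ relation makes the measurement dilemma concrete. A short Python sketch comparing two illustrative wavelengths (the specific values are assumptions chosen for the example, not from the original):

```python
import math

H = 6.62607015e-34  # Planck's constant in joule-seconds

def photon_momentum(wavelength):
    """Momentum (kg*m/s) carried by a photon of the given wavelength (m),
    from p = h / wavelength."""
    return H / wavelength

# A wavelength comparable to an atom's size (~1e-10 m, X-ray range)
# locates the atom well but delivers a large momentum kick.
p_short = photon_momentum(1e-10)

# Visible light (~5e-7 m) kicks the atom far less, but its waves
# diffract around the atom, blurring the position measurement.
p_long = photon_momentum(5e-7)
```

The shorter wavelength carries thousands of times more momentum, which is exactly why improving the position measurement worsens the momentum measurement.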