The Imprecision of Heisenberg’s Microscope


    In 1927 Werner Heisenberg published "On the Descriptive Content of Quantum Theoretical Kinematics and Mechanics", the paper in which he first presented the indeterminacy principle associated with his name. Near the beginning of Section 1 of the paper Heisenberg stated one of the tacit axioms of physics in a somewhat restricted form: "If one wants to become clear about what is to be understood by the words ‘position of an object’, for example of an electron (relative to a given reference system), then one must specify definite experiments with whose help one plans to measure the ‘position of the electron’; otherwise these words have no meaning." He then described just such an experiment using what he called a gamma-ray microscope, noting that he based the concept on the Compton effect, the scattering of x-ray photons by electrons, which Arthur Compton had discovered in 1923. He then noted, before going into a fully detailed discussion of his subject, that the commutation relation,

pq - qp = h/2πi    (Eq’n 1)

from his matrix mechanics formulation of the quantum theory entails that the precision in the measurement of the electron’s momentum and the precision in the measurement of the electron’s position necessarily conform to the relation

Δp·Δq ~ h    (Eq’n 2)

Today we write that relation, expressing Heisenberg’s indeterminacy principle, as

Δp·Δq ≥ h/4π    (Eq’n 3)
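As a quick numerical illustration of that relation, the following sketch (in Python, with an arbitrarily chosen momentum imprecision, purely for illustration) computes the smallest position imprecision that the principle permits:

```python
import math

# Planck's constant (exact in SI units since 2019)
h = 6.62607015e-34      # J*s
hbar = h / (2 * math.pi)

def min_position_uncertainty(delta_p):
    """Smallest position spread allowed by Δx·Δp ≥ ħ/2 for a given Δp."""
    return hbar / (2 * delta_p)

# Hypothetical example: an electron whose momentum we have pinned down
# to within Δp = 1e-24 kg·m/s
dp = 1e-24
dx = min_position_uncertainty(dp)
print(f"Δx ≥ {dx:.3e} m")   # about 5.3e-11 m, roughly an atomic diameter
```

The striking point is that a quite modest momentum precision already smears the electron's position over atomic dimensions.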

    The axiom that Heisenberg used in his paper seems to reflect George Berkeley’s doctrine of immaterialism: "To be is to be perceived," Berkeley proclaimed as he denied the existence of matter as a thing-in-itself, a thing with its own permanent existence based on something like Plato’s Forms. In Berkeley’s doctrine a thing can exist if and only if some witness perceives it. So imagine that you have gone for a walk in a forest and you come across a freshly fallen tree. I might ask you whether the tree made a sound when it fell. You assert that of course it did. Even if at the time it fell no one stood close enough to hear the sound? Yes, you reply, taking Berkeley’s position, because there was actually a witness present to perceive the sound and all of the other activity involved in a tree falling in a forest. And in that statement you see the basic idea behind Berkeley’s proof for the existence of God.

    The physicists’ version of the axiom tells us that "to exist a thing must be perceivable"; in other words, if some property exists, there must also exist, at least in potentia, an experiment that we can perform to detect and measure that property. Otherwise, as Heisenberg noted, the word that denotes that property has no legitimate referent and, therefore, no meaning. Note that we can take the bare fact that a particle exists as a property of that particle, one suitable for detection with appropriate experiments. Consider two examples.

    Since physicists determined, at the beginning of the Nineteenth Century, that light has a wave nature, they hypothesized the existence of a medium, the æther, which light required for propagation. Based on previous experience, both personal and vicarious, they assumed that any wave phenomenon needs something to do the waving. Even though evidence continued to accumulate across the 1800s to support the theory of light as a wave phenomenon, nobody could devise an experiment that would detect any sign of the æther – not Michelson and Morley in 1887, nor Trouton and Noble in 1903, nor anyone else.

    On the other hand, in 1911 physicists found what they were not seeking. In 1909 Hans Geiger (1882 Sep 30 – 1945 Sep 24) and Ernest Marsden (1889 Feb 19 – 1970 Dec 15), working under the supervision of Ernest Rutherford (1871 Aug 30 – 1937 Oct 19), performed an experiment that involved projecting thin, narrow beams of alpha-rays or beta-rays at metal foils. Rutherford had already discovered that he could detect alpha particles because whenever an alpha particle hits a thin film of zinc sulfide it produces a minuscule flash of light. And Rutherford knew that he and his team could explore the structure of matter by projecting alpha-rays through thin foils and then mathematically teasing the desired knowledge out of the pattern found in the data describing the deflection imposed on the alpha-rays as they passed through the foil. So Geiger and Marsden spent hours at a time in a darkened laboratory peering through microscopes and counting the faint twinklings flickering on the little zinc sulfide screens that they positioned near the foil through which they shot their alpha-rays.

    In the model of the atom accepted at the time, highly energetic alpha-rays would have passed more or less straight through any foil, as almost all of Geiger and Marsden’s alpha-rays did. That model described an atom as a thin cloud of positive electric charge in which the negatively-charged electrons flittered, distributed like the raisins in a plum pudding, to use the metaphor of the day. A relatively massive, highly energetic particle, like those that comprise alpha-rays, would punch through such fluffy atoms with extremely little deflection. But Geiger and Marsden found that a small fraction, less than one-tenth of a percent, of the alpha-rays that they detected got deflected through angles greater than ninety degrees. In 1911 Rutherford inferred that the anomalous deflections came from alpha particles that had undergone more or less head-on collisions with tiny particles containing all of an atom’s positive electric charge and almost all of its mass. He asserted that every atom consists of one such nucleus enveloped in a swirling cloud of electrons.

    The success of Geiger, Marsden, and Rutherford’s work led to experiments involving high-energy particle collisions dominating Twentieth-Century physics much as experiments with electric currents had dominated much of Nineteenth-Century physics. Prior to the invention of the cyclotron in 1930, x-rays offered some of the most energetic radiation available to physicists. In 1923 Arthur Compton (1892 Sep 10 – 1962 Mar 15) reported on experiments he had carried out the previous year, in which he had projected x-rays, obtained by shooting electrons at a molybdenum target, through graphite and made measurements on the scattered x-rays.

    In his 1923 paper, "A Quantum Theory of the Scattering of X-Rays by Light Elements", Compton described the difficulties he had encountered when he tried to use classical electromagnetic theory to calculate a description of x-rays scattered off the electrons in matter. Using what we now call the Old Quantum Theory, he devised an alternate description and found that the x-rays in his experiment conformed to it closely enough that he could assert what we now call Compton scattering as a description true to Reality. Contrary to what I long believed, he did not make any measurements on the scattered electrons, because he couldn’t: as he noted in his paper, the electrons that he would have to measure intermingled with electrons coming from other processes, such as the photoelectric effect, occurring in the graphite that he used as a target. He validated the theoretical result for electrons by combining measurements that he made on the scattered x-rays with the conservation laws pertaining to energy and linear momentum, thereby creating a neat little piece of theory that Heisenberg used in conceiving his gamma-ray microscope.
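Compton’s quantitative result relates the wavelength gained by a photon to the angle through which it scatters off a free electron: Δλ = (h/m<sub>e</sub>c)(1 - cosθ). A short sketch evaluates it (the scattering angles chosen here are arbitrary):

```python
import math

h = 6.62607015e-34        # Planck's constant, J*s
m_e = 9.1093837015e-31    # electron rest mass, kg
c = 299792458.0           # speed of light, m/s

def compton_shift(theta_rad):
    """Wavelength gained by a photon scattered through angle theta off a
    free electron: Δλ = (h / m_e c)(1 - cos θ)."""
    return (h / (m_e * c)) * (1 - math.cos(theta_rad))

# h/(m_e c) is the Compton wavelength of the electron, about 2.43e-12 m
print(f"shift at 90°:  {compton_shift(math.pi / 2):.3e} m")
print(f"shift at 180°: {compton_shift(math.pi):.3e} m")  # maximum: twice the Compton wavelength
```

Note that the shift depends only on the angle, not on the incident wavelength, which is why Compton needed penetrating x-rays: for visible light the fractional change would be immeasurably small.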

    Heisenberg’s imaginary experiment went something like this: we set up a single-lens microscope on the positive y-axis of a coordinate grid in such a way that one of the lens’s foci lies on the grid’s origin. Somewhere on the negative x-axis (to our left) we establish a source of gamma rays and appropriate shields to ensure that only those gamma rays traveling along the x-axis will cross the y-axis (at the origin) while somewhere on the positive x-axis we have set up an electron gun that shoots electrons at the origin. Behind the microscope, at the second focus, we have a screen that produces a flash of visible light when a gamma photon strikes it. We have thus set up in our minds a combination of the Rutherford and Compton experiments. Note that we have tacitly ascribed to the lens the "magic" property of refracting gamma rays just as an ordinary lens refracts visible light.

    We see a flash of light on the detector screen, right at the focus of the microscope’s lens. From that datum we infer that a gamma photon struck an electron right at the origin of our grid, got deflected through a perfect right angle, and passed through our microscope. And from that fact we extract perfect knowledge of the electron’s linear momentum: before the collision the electron had to travel along the x-axis in the negative x-direction with as much momentum as the gamma photon carried and after the collision the electron traveled along the negative y-axis with the same amount of linear momentum. In both the before and after aspects of the collision the linear momenta of the photon and the electron had to add up to zero for the perfect right-angle collision and rebound to occur. So we can obtain perfect knowledge of the electron’s position and linear momentum at some given instant.

    That naive analysis looks a little too Newtonian. We have subtly conceived our photon and electron as having the character of billiard balls colliding on a smooth, flat surface. For a proper quantum analysis we must acknowledge the wave nature of the particles involved. To that end Heisenberg had to introduce the microscope, using its lens to ensure that only gamma rays originating at the focus on the origin of the grid (through collisions with electrons) would get projected onto the other focus, where the detector sat. The experiment won’t work at all without the microscope, certainly, but it doesn’t work well with it either.

    If the microscope’s lens has a radius of r and a focal length of f, then it will accept and pass all photons emanating from the origin of the system so long as they follow paths that make an angle of less than θ with the y-axis, with

tanθ = r/f    (Eq’n 4)

After they collide with their electrons, then, those photons could carry an x-ward quantity of linear momentum that differs from zero by as much as ±Δp, with

Δp = p·sinθ    (Eq’n 5)

and p representing the photon’s full y-ward linear momentum. That fact necessitates, by way of Newton’s third law of motion (which remains valid in the quantum theory), that the electron recoil with an equal amount of x-ward momentum in the opposite direction. Thus we have a necessary imprecision (Ungenauigkeit, the German word that Heisenberg used) of 2Δp in our knowledge of the electron’s linear momentum, although we still seem to have perfect precision in our knowledge of its position at the time of the collision. And we seem to have the possibility of improving our precision in measuring the linear momentum of the gamma photon (and, thus, of the electron) by reducing the ratio r/f.
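A minimal sketch of that geometry, using a hypothetical photon wavelength and a few trial values of the ratio r/f, shows how the momentum imprecision grows with the lens’s angular aperture:

```python
import math

def momentum_spread(p, r, f):
    """Δp = p·sinθ for a lens of radius r and focal length f seen from
    the focus, where tanθ = r/f."""
    theta = math.atan2(r, f)   # half-angle subtended by the lens
    return p * math.sin(theta)

# Hypothetical numbers: a gamma photon with p = h/λ for λ = 1e-12 m
h = 6.62607015e-34
p = h / 1e-12               # roughly 6.6e-22 kg·m/s
for r_over_f in (0.1, 0.5, 1.0):
    dp = momentum_spread(p, r_over_f, 1.0)
    print(f"r/f = {r_over_f}: Δp = {dp:.2e} kg·m/s")
```

Shrinking r/f shrinks Δp, just as the paragraph above suggests; the catch, as the next section shows, is what that shrinkage does to Δx.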

    Niels Bohr, who had a fuller knowledge of optics, told Heisenberg of a phenomenon that he had missed, one that destroys all possibility of gaining perfect knowledge through the gamma-ray microscope. In addition to focusing the radiation, the lens acts as an aperture and thus diffracts the gamma-rays. Referring to Lord Rayleigh’s (John William Strutt, 3rd Baron Rayleigh: 1842 Nov 12 – 1919 Jun 30) criterion for the resolving power of the lens,

Δx = 1.22λf/r    (Eq’n 6)

Bohr noted that the gamma photons coming from points within a distance Δx from the origin would appear on the detector as indistinguishable from those coming from the origin itself.

    Now into Equation 5 we make the substitutions p=h/λ (in which λ represents the wavelength associated with the gamma ray) and sinθ=tanθcosθ=(r/f)cosθ and get

Δp = (h/λ)(r/f)cosθ    (Eq’n 7)

Changing the ratio r/f changes both Δx and Δp, but in opposing ways, so if we multiply Equations 6 and 7 together, we get

Δx·Δp = 1.22h·cosθ    (Eq’n 8)

That looks very much like the famous uncertainty principle, but with an adjustable parameter. But we know that

tanθ = sinθ/cosθ = r/f    (Eq’n 9)

so if we try to make θ approach ninety degrees, we must make r very much larger than f. But in that circumstance Δp approaches p, thereby giving us the maximum possible uncertainty in the momentum. On the other hand, if we try to make θ approach zero, we must make r very much smaller than f and, in consequence, Δx blows up. We cannot, by any means, eliminate the imprecision from the gamma-ray microscope, so we must balance the two uncertainties by selecting an angle, entirely arbitrarily, in the middle of the range. If we choose θ=34.95 degrees, then 1.22cosθ=1 and we have

Δx·Δp = h    (Eq’n 10)

which just gives us the standard form of Heisenberg’s uncertainty principle.
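We can check that choice of angle numerically. The sketch below takes the product Δx·Δp = 1.22h·cosθ obtained by multiplying Equations 6 and 7 together and confirms that the 1.22cosθ factor equals one at θ ≈ 34.95 degrees:

```python
import math

h = 6.62607015e-34  # Planck's constant, J*s

def uncertainty_product(theta_rad):
    """Δx·Δp = 1.22·h·cosθ, the product of the Rayleigh resolution
    limit and the aperture-limited momentum spread."""
    return 1.22 * h * math.cos(theta_rad)

# The angle at which 1.22·cosθ = 1, i.e. θ = arccos(1/1.22)
theta_star = math.acos(1 / 1.22)
print(math.degrees(theta_star))             # ≈ 34.95 degrees
print(uncertainty_product(theta_star) / h)  # ≈ 1.0, i.e. Δx·Δp = h
```

The choice of angle only fixes the numerical coefficient; no choice makes the product fall below the order of h.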

    But Bohr took the analysis a big step further. He knew that de Broglie’s hypothesis had made quantum mechanics an optical mechanics, so he knew that particles do not so much move as they propagate. Not only the gamma photon, but also the electron has a wave nature. And that nature must have a specific shape, that of a wave packet.

    We have assumed that a gamma photon has a single specific wavelength (and thus a single specific frequency, because the product of wavelength and frequency must equal the speed of light), but such a thing would have infinite extent and thus not correspond to a real photon. A proper description of the particle requires that we superimpose upon that wave a set of waves with wavelengths both longer and shorter than the primary wavelength, which we use to calculate the particle’s linear momentum. In accordance with Fourier’s theorem, mutual interference of the waves localizes the particle’s existential and dynamic properties; in particular, those properties get localized primarily within a span of Δx. Additional parameters that describe the wave packet include Δt, the time it takes for the bulk of the wave packet to pass a given point; Δν, the frequency interval in which the greater bulk of the wave packet’s frequencies lie; and Δk=Δ(1/λ), the range of wave numbers (the reciprocal of wavelength) in which the greatest bulk of the wave packet lies.
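The localization that mutual interference produces shows up in a small numerical sketch (the central wave number and bandwidth here are arbitrary toy values): superposing waves whose wave numbers span a band Δk yields a packet whose amplitude first vanishes at a distance of roughly 1/Δk from its center, so that Δx·Δk ≈ 1.

```python
import math

def packet(x, k0=50.0, dk=10.0, n=2001):
    """Superpose n cosine waves whose wave numbers (1/λ) are spread
    uniformly over [k0 - dk/2, k0 + dk/2]; return the amplitude at x."""
    total = 0.0
    for i in range(n):
        k = k0 - dk / 2 + dk * i / (n - 1)
        total += math.cos(2 * math.pi * k * x)
    return total / n

# The packet peaks at x = 0, where every wave arrives in phase, and its
# envelope first vanishes near x = 1/dk, where the waves cancel.
dk = 10.0
print(packet(0.0))       # 1.0, fully constructive interference
print(packet(1 / dk))    # ≈ 0, destructive interference
```

Narrowing the band dk pushes the first null farther out, spreading the packet over a larger Δx; widening the band pulls it in. The product of the two spreads stays of order one, which is the content of the optics theorem invoked below.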

    Those numbers represent necessary imprecisions in measurements made on a wave packet. Bohr knew a theorem of optics that inter-relates those numbers for a wave packet whose shape minimizes those imprecisions as much as Reality allows. That theorem states that

Δν·Δt ≥ 1    (Eq’n 11)

and

Δk·Δx ≥ 1    (Eq’n 12)

Those relations apply to all wave phenomena; in particular, they apply to the matter waves of Louis de Broglie’s theory. By using the Einstein - de Broglie relations (E=hν and p=hk=h/λ) Bohr rewrote those equations as

ΔE·Δt ≥ h    (Eq’n 13)

and

Δp·Δx ≥ h    (Eq’n 14)

the inequalities normally associated with Heisenberg’s uncertainty principle. But Bohr’s analysis showed that the quantum theory goes far beyond simple uncertainty to full-out indeterminacy, the absence of any possibility of getting a precise measurement of any two dynamically conjugate quantities.

    The indeterminacies in energy and linear momentum raise a fundamental question: at the quantum level do those quantities still obey their classical conservation laws? If so, do they obey those laws perfectly or only on average? Bohr noted that in order to verify those conservation laws we must measure energy and linear momentum with perfect precision, with a consequent loss of precision in determining locations in space and time. Under other circumstances we cannot say whether the conservation laws remain valid: we must necessarily say that we cannot verify their validity. But we can note that the same conjugate pairs that appear in Equations 13 and 14 also coordinate with each other in Noether’s theorem. We must thus refer to that theorem of spatio-temporal symmetries to see whether the conservation laws remain valid in the quantum realm.


Back to Contents