[Note: I am in the final throes of book writin', which will be all-consuming for the next several weeks. So Jen-Luc Piquant has jumped into her personal TARDIS (it exists in the virtual world, okay?) and brought back a 2007 post on particle/wave duality for your reading pleasure -- lightly edited and updated with a new lede. 100% original blogging will resume in August, after we take a much-needed vacation. In the meantime, we hope you enjoy these blasts from the cocktail party's past.]
Over at Ars Technica, Matt Francis has the scoop on a nifty new quantum interference experiment using large molecules of phthalocyanine and its derivatives. The molecules are quite large — composed of more than 100 atoms — which means this experiment is approaching that critical threshold where quantum effects (the subatomic realm) give way to classical physics (the macroscale realm in which we live).
This isn’t the first time molecular-scale particle/wave duality has made the news. Back in 2007, Lawrence Berkeley National Laboratory researchers announced that they had performed the world’s smallest double-slit experiment and determined that quantum particles will start behaving in accordance with classical physics at the size scale of a single hydrogen molecule. Physicists excitedly discussed these marvelous results with a passion most people reserve for Super Bowl Sunday. But the average reader’s eyes probably just glazed over with incomprehension, leaving him/her to wonder what all the fuss is about.
Truthfully? It’s tough to grasp the significance of these cutting-edge quantum wrinkles without a bit of background about Thomas Young’s original 1802 experiment (now the poster child of the quantum concept of particle/wave duality), as well as the historical scientific debate that raged around the nature of light.
Particle or wave? That was the question. It proved to be an especially contentious issue; the debate raged for millennia, in fact. Pythagoras, in 5th century BC Greece, was staunchly “pro-particle,” while Aristotle (who lived a couple hundred years later) was ridiculed by contemporaries for daring to suggest that light travels as a wave. The confusion was understandable, because empirical observations of the behavior of light contradicted each other.
On the one hand, light traveled in a straight line and would bounce off a reflective surface. That’s how particles behave. But it also could diffuse outward, and different beams of light could cross paths and mix together. That’s undeniably wave-like behavior. In short, light had a split-personality disorder.
By the 17th century, many scientists had generally accepted the wave nature of light, but there were still holdouts in the research community — among them no less a luminary than Sir Isaac Newton, who conducted many experiments with light, including his famous experimentum crucis, using a prism to demonstrate that white light was actually composed of many different colors of light added together.
Newton argued vehemently that light was composed of streams of particles that he dubbed “corpuscles.” In 1672, colleagues persuaded Newton to publish his conclusions about the corpuscular nature of light in the Royal Society’s Philosophical Transactions.
He seemed to assume that his ideas would be greeted with unanimous cheers, and was rather put out when Robert Hooke and the Dutch physicist Christiaan Huygens were reluctant to jump on the Isaac Bandwagon. The result was an acrimonious, four-year debate. Huygens differed with Newton on such key points as how the speed of light changes as light goes from a less dense medium like air to a denser material like glass: Newton said it should increase; Huygens said it should decrease. The issue remained largely untested because at the time there was no good way to measure the changes in speed.
Ultimately, Newton’s stature as one of the greatest physicists of all time ensured that his notion of streams of corpuscles won out over the wave theory of light — until that cheeky over-achieving upstart, Thomas Young, appeared on the scene almost a century later. Young was the oldest of 10 children born to a Quaker family in Somerset, England, and proved to be alarmingly precocious.
He could read by the age of 2, learned Latin by age 6, and by the time he was 14, he’d added Greek, French, Italian, Hebrew, Chaldean, Syriac, Samaritan, Arabic, Persian, Turkish, and Amharic to his linguistic repertoire. His facility with languages served him well later in life, when he became fascinated with Egyptian hieroglyphics and played a key role in cracking the code of the Rosetta Stone by deciphering several Egyptian cartouches.
Young studied medicine in London and Edinburgh, earned a medical doctorate in Göttingen, and rounded out his studies at Cambridge before setting up shop as a physician in London. By age 28, he’d been appointed a professor of natural philosophy at the Royal Institution, delivering lectures about his experiments in everything from optics, acoustics, climate, and the nature of heat, to electricity, hydrodynamics, astronomy, gravitation, and measurement techniques.
The term “polymath” hardly seems to do him justice; his fellow students at Cambridge used to call him “Phenomenon Young.” No wonder his epitaph at Westminster Abbey salutes him as “…a man alike eminent in almost every department of human learning.”
Ah, but could this brilliant young phenomenon take on The Goliath of Physics and win? Young was actually a huge fan of Newton and based his early work on color and vision on the insights Newton gleaned from his experimentum crucis. But that didn’t mean he accepted the Great Man’s conclusions without question.
His pivotal experiment didn’t start out as the poster child for the quantum concept of wave/particle duality; like every other scientist of his day, the fact that light might be both was simply inconceivable to Young. So he designed an experiment he believed would determine the matter once and for all.
Naturally, a darkened room was involved, along with a light source (probably a candle, or sunlight, this being the early 19th century). Young shone the light onto a barrier in which he’d cut two narrow, parallel slits, a fraction of an inch apart. On the other side was a white screen.
He reasoned that if light were made of particles, as Newton claimed, the screen would show two bright parallel lines where the light particles had passed through one slit or the other. But if light were a wave, it would pass through both slits, separating into secondary waves that would then recombine on the other side — i.e., they would interfere with each other.
It’s a bit like water waves, which have crests and troughs. As the secondary light waves recombine on the other side, wherever two crests or troughs line up exactly, they produce a bright spot of light. Wherever a crest and a trough line up exactly, they cancel each other out, leaving a dark spot on the screen. The resulting “interference pattern” is thus a series of alternating dark and light bands. And that’s exactly what Young observed, even making his own sketch of the interference pattern. Light, per his experiment, was undeniably a wave.
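Young couldn’t measure any of this directly, but the geometry behind his bands is simple enough to work out. Here’s a minimal Python sketch of the standard two-slit intensity formula in the small-angle approximation; the wavelength, slit spacing, and screen distance below are made-up illustrative values, not the dimensions of Young’s actual apparatus.

```python
import math

# Toy two-slit intensity model (far-field, small-angle approximation).
# All numbers are illustrative, not Young's actual setup.
wavelength = 600e-9    # reddish light, in meters
slit_gap = 0.5e-3      # distance between the two slits, in meters
screen_dist = 1.0      # barrier-to-screen distance, in meters

def intensity(y):
    """Relative brightness at height y on the screen.

    The path difference between the two slits is roughly
    slit_gap * y / screen_dist. Crests align (bright band) when that's
    a whole number of wavelengths; a crest meets a trough (dark band)
    at half-integer multiples.
    """
    path_diff = slit_gap * y / screen_dist
    phase = math.pi * path_diff / wavelength
    return math.cos(phase) ** 2

# Bright fringes sit at y = m * wavelength * screen_dist / slit_gap
fringe_spacing = wavelength * screen_dist / slit_gap
print(f"fringe spacing: {fringe_spacing * 1e3:.2f} mm")     # 1.20 mm
print(f"screen center (bright band): {intensity(0):.2f}")
print(f"half a fringe away (dark band): {intensity(fringe_spacing / 2):.2f}")
```

Note how, with a visible-light wavelength and slits half a millimeter apart, the bands come out a bit over a millimeter apart: easily visible to the naked eye, which is why Young could sketch them.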
Young was understandably pretty chuffed at the success of this experiment, which offered the strongest evidence to date in favor of the wave theory of light. He applied his findings to explain the shifting colors found in thin films, such as soap bubbles, and even tied the seven colors of Newton’s rainbow to wavelength, calculating what each color’s approximate wavelength would have to be to produce that particular color of light.
Alas, his euphoria was short-lived: the pro-Newton crowd lost no time in bashing Young’s experimental findings. One simply didn’t question the Great One, even 80 years after Newton’s death. Online encyclopedist David Darling memorably described it as “the scientific equivalent of hari-kiri.” Young was too, well, young to know better. Newton’s place in the pantheon ensured that the scientific community largely ignored Young’s pivotal experiment for a good 10 years, bolstered by a simply savage review of his work in the Edinburgh Review (published anonymously in 1803, later revealed to have been authored by one Lord Henry Brougham, a big-time Isaac acolyte).
Fortunately for the wave-friendly fans of light, French physicist Augustin Fresnel conducted a series of more comprehensive demonstrations of Young’s basic experimental setup, succeeding (where Young had failed) in convincing the world’s scientists that light really was a series of waves, rather than streams of tiny particles.
And in the mid-19th century, another Frenchman, Leon Foucault, proved that Huygens had been correct — and Newton mistaken — in his assertion that light travels more slowly in water than in air. Given the acrimony Huygens experienced from Newton for sticking to his guns on this issue, one would understand if the Dutch scientist indulged in a little “Nyah, nyah, nyah” type of gloating from beyond the grave.
(It probably helped that the French were a bit less worshipful of Newton than the Brits. Jen-Luc Piquant urges us to remember that even the greatest scientists are often wrong. Huygens, in fact, was partially responsible for advancing the notion that light waves travel via an invisible substance called the luminiferous aether, later disproved by the famed Michelson-Morley experiment in 1887.)
There were a bunch of other breakthroughs going on at the same time, of course, and taken together, everything added up to strong support for the “light is a wave” school of thought. Case closed. Or so physicists thought as the 19th century drew to a close. But light had a few more surprises in store for them with the birth of quantum mechanics. It’s too long a story to go into here, but Max Planck, Albert Einstein, and Arthur Compton were among the luminaries whose work led to the realization that light was both particle and wave: specifically, light is made of photons that collectively behave as a wave.
Sounds simple enough. Except quantum mechanics is never that simple. The revolution didn’t end there. Quantum theory predicted that even the individual photon could behave like a wave, and essentially interfere with itself. For a long time, there was no way to test this prediction. But eventually technology and scientific instrumentation advanced to the point where they could emit and detect single photons. The modern version of the experiment looks like this.
First, we need a researcher — let’s say, Paris Hilton, just to stretch your powers of imagination a little. Paris sets up a simple light source in front of a barrier with two small slits cut into it, with a light-sensitive screen on the other side to record the pattern of incoming light. Paris turns on the light source and (after being briefly hypnotized by the shiny beams) sends a series of photons, one photon at a time, toward the two slits in the barrier.
We’re talking about single particles here, so the photons should only be able to go through one slit or the other, and just strike the screen like so many tiny ping-pong balls. Instead, Paris is stunned to find that the light forms that telltale interference pattern — alternating bands of dark and light — on the screen on the other side. What the heck? This means that those single photons are behaving like waves; each photon somehow travels through both slits and interferes with itself on the other side.
Now Paris wants to know more. This is a woman who reads Sun Tzu, after all; her natural curiosity drives her to repeat the experiment with an extra twist: she places particle detectors by each of the slits, so that she can verify that the photons do, in fact, each go through both slits at the same time.
Except this time, she doesn’t get the interference pattern; she gets the ping-pong ball effect, which means that the photon is now behaving like a single particle, passing through one slit or the other. Are the photons just messing with her?
Unable to cope with the quantum conundrum, Paris Hilton’s head explodes. Millions rejoice. Tabloids mourn. And those mischievous photons give an evil cackle of delight at having claimed another victim.
The good news is, the photons aren’t deliberately messing with our heads. There is an explanation for the two results, but it’s an explanation that defies common sense.
Instead of merely tweaking her first experiment, thanks to the addition of the particle detectors, Paris unwittingly performed a completely different experiment the second time around. In the first version, she’s making a wave measurement; in the second, she’s making a particle measurement. The kind of measurement she chooses to make determines the outcome of the experiment.
If Paris just lets the photons travel from the light source to the screen undisturbed, they behave like waves and she sees the interference pattern. But if she observes them en route, she knows which path the photons took; this knowledge forces them to behave like particles, passing through one slit or the other. Paris can construct her experiment to produce an interference pattern, or to determine which way the single photons went. But she can’t do both at the same time. Heisenberg’s Uncertainty Principle rears its ugly head.
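The bookkeeping behind Paris’s two runs can be captured in a toy calculation. The textbook rule is: when the two paths are indistinguishable, you add the quantum amplitudes and then square to get a probability (fringes survive); once a detector tags which slit was used, you add the probabilities instead (fringes vanish). The Python sketch below is just that rule applied at points across the screen — an illustration of the standard recipe, not a simulation of any real apparatus.

```python
import math

def screen_pattern(phase, which_path_known):
    """Relative chance of a photon landing at a screen point whose
    two-slit phase difference is `phase`."""
    amp1 = complex(1.0, 0.0)                          # path via slit 1
    amp2 = complex(math.cos(phase), math.sin(phase))  # path via slit 2
    if which_path_known:
        # Particle measurement: add probabilities -> flat, no fringes.
        return (abs(amp1) ** 2 + abs(amp2) ** 2) / 2
    # Wave measurement: add amplitudes, then square -> interference fringes.
    return abs(amp1 + amp2) ** 2 / 4

# Sweep a set of points across the screen.
points = [i * math.pi / 8 for i in range(17)]
wave = [screen_pattern(p, which_path_known=False) for p in points]
particle = [screen_pattern(p, which_path_known=True) for p in points]

# Fringe visibility: (max - min) / (max + min); 1 = crisp fringes, 0 = none.
vis = lambda xs: (max(xs) - min(xs)) / (max(xs) + min(xs))
print(f"wave run visibility:     {vis(wave):.2f}")      # ~1.00: strong fringes
print(f"particle run visibility: {vis(particle):.2f}")  # ~0.00: fringes gone
```

The entire difference between Paris’s two experiments lives in that one branch: square-then-add versus add-then-square.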
Hence the opening line of the Berkeley press release from 2007: “The big world of classical physics mostly seems sensible: waves are waves and particles are particles, and the moon rises whether anyone watches or not. The tiny quantum world is different: particles are waves (and vice versa), and quantum systems remain in state of multiple possibilities until they are measured — which amounts to an intrusion by an observer [Paris Hilton!] from the big world — and forced to choose: the exact position or momentum of an electron, say.”
There are a lot of really big ideas contained in those two sentences, more than we can even attempt to discuss intelligently in a single blog post. Vast tomes have been written about this, and countless papers are published each year in academic journals.
But we were most impressed with the sheer ingenuity of the experimental setup the LBL researchers constructed. They used the two proton nuclei of a hydrogen molecule as the two “slits,” separated by a mere ten-billionths of a meter. The tricky part is to separate the component parts of the hydrogen molecules in the first place. How the heck did they manage that?
It helps if you have access to a couple of x-ray beam lines at LBL’s Advanced Light Source. All you need to do (are you taking notes, Paris?) is send a stream of hydrogen gas through the apparatus into an “interaction region” (the equivalent of an enclosed chamber, would be my guess), where some of the hydrogen molecules run afoul of that nasty x-ray beam, which has sufficient energy to knock off each hydrogen molecule’s two negatively charged electrons. Without that negative charge to balance things out, the two positively charged protons that form the nucleus of the molecule blow apart from the powerful mutual repulsion.
The LBL researchers then used an electric field to separate the particles according to charge, sending the protons to one detector and the electrons to a detector in the opposite direction. Genius! LBL researcher Ali Belkacem calls this “a kinematically complete experiment,” because it accounts for every single particle, enabling them to figure out all kinds of things, like “the momentum of the particles, the initial orientation and distance between the protons, and the momentum of the electrons.”
It’s not just photons that exhibit wave/particle duality: electrons do, too. So even a single electron is capable of interfering with itself. Just as in the classical version of the experiment, the scientists could study the electrons as particles or as waves. For instance, they found that once the electrons were knocked off the hydrogen molecule, one was fast and one was slow, giving them an assortment of both to work with.
Mostly they were interested in the interference pattern, particularly at what point it disappeared. They essentially turned the slower electrons into teensy particle detectors by boosting their energy levels just a tad. This turns the slow electrons into “observers.” They are “big” enough to interact with the classical domain. So the interference pattern disappears and the electrons behave almost like a classical system. I say “almost,” because apparently they still retain some signs of entanglement (what Einstein called “spooky action at a distance”).
So there you have it: the world’s smallest double-slit experiment.
A version of this post first appeared on the Cocktail Party Physics archive blog in November 2007.
Images: (top) Newton’s sketch of his prism experiment. Public domain. (center) Thomas Young’s sketch of two-slit diffraction of light presented to the Royal Society in 1803. (Thomas Young/Wikipedia). (bottom) Paris Hilton reads Sun Tzu. Source: All over the Internet, although this particular image file is from here.
Akoury, D. et al. “The Simplest Double Slit: Interference and Entanglement in Double Photoionization of H2,” Science 318, 949-952, November 9, 2007.
Juffmann, Thomas et al. “Real-time single-molecule imaging of quantum interference,” Nature Nanotechnology, published online March 25, 2012. DOI: 10.1038/nnano.2012.34
Newton, Isaac. (1671) “A Letter of Mr. Isaac Newton, Professor of the Mathematicks in the University of Cambridge; Containing His New Theory about Light and Colors,” Philosophical Transactions of the Royal Society of London 6, 3075-3087.
Young, Thomas. (1804). “Experimental Demonstration of the General Law of the Interference of Light,” Philosophical Transactions of the Royal Society of London 94.