Not long ago I came across a piece in the Scientific American archives from the earliest days of very-long-baseline radio interferometry, the technique employed by the Event Horizon Telescope. As readers of this blog will know, the Event Horizon Telescope is a planet-size array of radio telescopes, currently being developed, that will soon be used to image the black hole at the center of the Milky Way. Radio astronomer Ken Kellermann, now a senior scientist at the National Radio Astronomy Observatory, was around for the development of very-long-baseline interferometry, and in 1972 he wrote a twelve-page feature story on the subject for this magazine (PDF, purchase/subscription required).
It’s a fascinating document for several reasons. First, it’s just good, with one of the clearest explanations of long-baseline radio interferometry (in which “two relatively small antennas act as opposite edges of a single huge radio telescope”) I’ve read. But the article is interesting for historical reasons as well. As Kellermann makes clear, a major motivation for developing very-long-baseline interferometry was to get a close look at the cores of quasars, which Maarten Schmidt had discovered just nine years before.
In 1972, quasars remained brain-meltingly mysterious. They were far too bright and far too distant to be easily explained. As Kellermann writes: “The energy required to account for the observed radiated power is exceedingly large, and the problem of the origin of this energy and its conversion to relativistic particles has been one of the most challenging in modern astrophysics.”
It is now generally accepted that quasars are extremely distant galaxies that contain at their cores supermassive black holes on feeding rampages. This idea was in the air when Kellermann’s piece was published–John Wheeler had coined the term “black hole” in 1967, and in 1969 Donald Lynden-Bell had proposed that quasars could be powered by supermassive black holes–but it clearly wasn’t well established. Instead, quasars posed a challenge bordering on a crisis: “Some astronomers believe that here we have reached the limit of conventional physics and that only fundamentally new theories will explain the seemingly fantastic energy output of galactic nuclei and quasars.”
Finding a solution to that crisis would involve peering deep inside quasars, studying “these incredibly tiny objects in sufficient detail to unravel the complex phenomenon that produces the intense radio emission,” as Kellermann writes. Peering deep inside quasars would require the construction of telescopes with unprecedented angular resolution. Pairs of radio telescopes placed sufficiently far apart could do the job, but they would have to be ridiculously far apart–on different continents.
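The arithmetic behind that claim is the diffraction limit: an aperture (or interferometer baseline) of size D observing at wavelength λ resolves angles no finer than roughly λ/D radians. A quick sketch makes the point; the 6 cm wavelength, 100 m dish, and 8,000 km baseline below are illustrative numbers of mine, not figures from Kellermann’s article:

```python
import math

def angular_resolution_mas(wavelength_m: float, baseline_m: float) -> float:
    """Diffraction-limited resolution theta ~ lambda / D, in milliarcseconds."""
    theta_rad = wavelength_m / baseline_m
    return math.degrees(theta_rad) * 3.6e6  # radians -> milliarcseconds

# A single large (100 m) dish observing at a 6 cm wavelength:
single_dish = angular_resolution_mas(0.06, 100.0)   # roughly two arcminutes
# Two antennas on different continents, 8,000 km apart, at the same wavelength:
vlbi = angular_resolution_mas(0.06, 8.0e6)          # milliarcsecond scale
```

A continental baseline improves on the single dish by the ratio of the baseline to the dish diameter, here about a factor of 80,000, which is why only intercontinental pairs could resolve quasar cores.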
At the time, radio interferometers consisted of two antennas spaced a few kilometers apart and connected by cable; these are called connected-element interferometers. Kellermann proposed that “tape-recording interferometers” could extend baselines to span continents and oceans. In this scheme, each antenna in the interferometer separately records the signals it receives; later, scientists compare the two recordings. If the recordings are properly synchronized so that the signals line up in phase, the uncorrelated noise averages away and the faint cosmic signal emerges. Kellermann writes that the possibility was considered as early as 1961 in the U.S.S.R., but the technology wasn’t ready: “The use of tape-recording interferometers to study the much weaker radio emission of radio galaxies and quasars had to wait until stable atomic frequency standards and high-speed tape recorders were commercially available.”
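The comparison step Kellermann describes is, at heart, a cross-correlation: slide one recording against the other until the hidden common signal lines up, at which point the independent receiver noise at each station averages away. Here is a toy sketch of that idea; the station names, noise levels, and the sample delay are all invented for illustration and have nothing to do with real VLBI hardware:

```python
import random

random.seed(42)

N = 20000   # samples recorded at each station
delay = 7   # "unknown" geometric delay between stations, in samples (invented)

# A weak common "cosmic" signal buried in much stronger, independent
# receiver noise at each antenna.
signal = [random.gauss(0, 0.3) for _ in range(N + delay)]
station_a = [signal[i] + random.gauss(0, 1.0) for i in range(N)]
station_b = [signal[i + delay] + random.gauss(0, 1.0) for i in range(N)]

def correlate(a, b, lag):
    """Average product of a[i] and b[i - lag] over the overlapping samples."""
    n = len(a) - lag
    return sum(a[i] * b[i - lag] for i in range(lag, len(a))) / n

# Scan trial delays: the correlation peaks when the two recordings are
# aligned, recovering the true delay even though neither recording on its
# own shows the signal above the noise.
best_lag = max(range(20), key=lambda lag: correlate(station_a, station_b, lag))
```

In a real tape-recording interferometer the atomic frequency standards Kellermann mentions are what make this alignment possible: without clocks stable enough to timestamp each sample, there is no way to know which stretches of the two tapes to compare.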
If you’re interested in the subject, you might want to check out Kellermann’s piece. It gives a great sense of how far both astrophysical knowledge and astronomical technology have come in the past few decades. It’s also striking that the technique astronomers might soon use to take the first picture of a black hole was developed forty years ago, unwittingly, for that very purpose.