Clarissa Ai Ling Lee is a PhD candidate in the Literature Program at Duke University, specializing in science and technology studies and comparative media studies. She also holds an undergraduate degree in physics and an MA in English from the University of Malaya in Kuala Lumpur.

Contact Clarissa Ai Ling Lee via email.

Follow Clarissa Ai Ling Lee on Twitter as @normasalim. Or visit their website.

**Introduction**

Thinking diagrammatically as a way of conceptualizing our world has existed since the moment the first cave-dweller picked up a soft ‘rock’ and began making markings on the walls of a dwelling. As civilization progressed, humanity moved on to recording its activities and learning for posterity, via tablets, papyri, and paper.

As we know from studies of the ancient civilizations of Egypt, Ur, and Sumer, many of the earliest inscriptions and scribbles were simplified renditions of visual observations made by scribes. Among these observations were agricultural and seasonal cycles, cosmological alignments visible to the naked eye (not too difficult when there were no human-created pollutants), and medical therapeutic practices (which tended to be connected to astrological predictions).

Of course, there were also records of politics, observations of particular traditions, and stories of great wars and battles. However, the chronicles of agriculture, astronomy, architecture, engineering, medicine, and mathematics contain some of the most interesting diagrammatic mappings, which perhaps grew out of a desire to create elegant explanations and representations of what was observed.

Hence, some of the earliest diagrams, unsurprisingly, captured our perception and morphological view of the world. Later, this outline would be filled in with more complex symbolic representations, provided through the mathematics developed to simplify and abstract analyzable content from a very complicated and multi-layered nature.

As the culture of the diagram in science is magnificently wide in breadth and scope, for this post I will concentrate on a few related examples in physics and one from mathematics. In part II, I will look at the cultures of the diagram in the biological (including medical) and chemical sciences, with input from natural history. In the final installment (part III), I will go into the relationship between developments in the history of printing and book-making technology and the use of diagrammatic illustrations, in two- and three-dimensional formats, in scientific books.

**A Little Prehistory**

Before going further, it is probably judicious to provide a definition of the diagram from which the rest of the post will draw. According to John Bender and Michael Marrinan in their book The Culture of Diagram, the characteristics of the diagram include “reductive renderings, usually executed as drawings, using few if any colors; they are generally supplemented with notations keyed to explanatory captions, with parts correlated by means of a geometric notational system” (7). They go on to broaden the term by also including a mid-nineteenth-century definition from the Oxford English Dictionary, which states that the diagram can be used to represent, symbolically, the process and outcome of actions, and the variations characterizing them.

Hence, for the Greeks of antiquity, the diagramming of mathematical thought in lines and circles served both calculation and note-taking, preserving a symbolic representation of the physical and natural edifices that were the objects of their analyses. The mapping of constellations and planetary motions through proportional representations was likewise a reason diagrams were used as both computational and notational devices: the ancient astronomers wanted to approximate the movements of celestial objects across the heavens, often to project seasonal and climate changes, and even for astrological predictions.

Many elaborate diagrammatic writings can be seen in the almanacs and numerical tables produced during that time. Geometrical drawing was also extended to produce early versions of engineering drawings of mechanical devices such as projectiles, pulleys, aqueducts, gears, catapults, ramps, and various forms of hydrodynamic and land-based mechanical instruments. Beyond the mundane, the thinkers of antiquity also had loftier interests in scientific puzzles involving aerodynamics, optics, and the properties and substance of matter, though much of their work in this area remained theoretical blueprints.

Nevertheless, the contributions of the ancient Greeks (as well as of Arab, Persian, Jewish, Chinese, and Indian thinkers) to the development of mathematics, and its mechanical potentialities, were to influence the onset of the Renaissance, with growing interest in the intersection of the arts and sciences from the late Middle Ages. The most famous examples of such an interdisciplinary development of the diagram are the sketches and drawings produced by Leonardo da Vinci and his contemporaries.

**Drawing the numbers**

A fundamental aspect of ‘number theory’ is set theory, which represents relationships between types and categories of numbers (rational, irrational, real, imaginary, even, odd, prime). However, the numbers in themselves mean nothing unless their significance can be demonstrated, which means foregrounding their relationships to one another and as functions of one another. The Venn diagram provides the tool for such logical representation of relationships. If you remember math from high school, you will remember seeing the simplest version of that diagram.

Here is a list of possible simple constructions of the Venn diagram: imagine a circle standing for Set A and another circle representing Set B, each situated independently of the other. Each of these sets could represent a specific category of any objects you can think of. It is possible that Set A and Set B remain completely independent and unrelated, yet you may still want to combine them to form Set C, a composite of Set A and Set B; it is also possible that they are connected because they contain certain objects that bear similar characteristics or are the same. It is also possible that Set B is a subset of Set A. The diagram’s usefulness in graphing out logical relationships has made it a popular instrument even in non-mathematical contexts, as a quick image search on Google with the keyword ‘Venn diagram’ will show.
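The relationships sketched above can be mirrored directly in code. Here is a minimal illustration (my own, not from the post) using Python’s built-in set type; the example sets are hypothetical:

```python
# Two hypothetical sets, standing in for the circles "Set A" and "Set B"
evens = {2, 4, 6, 8, 10}   # Set A
primes = {2, 3, 5, 7}      # Set B

# The composite "Set C" combining both circles (union)
combined = evens | primes          # {2, 3, 4, 5, 6, 7, 8, 10}

# The overlapping region of the two circles (intersection)
shared = evens & primes            # {2}: the only even prime

# One circle drawn entirely inside the other (subset test)
nested = {4, 8} <= evens           # True

print(combined, shared, nested)
```

Each operation corresponds to a region of the two-circle diagram: union to everything enclosed by either circle, intersection to the overlap, and the subset test to a circle nested wholly inside another.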

Introduced in 1880 by John Venn, the diagram allows a geometrical representation of otherwise abstract logical and algebraic forms, and has many applications in probability theory, topology, abstract algebra, algebraic geometry, and higher-level logic. In fact, the development of the Venn diagram carried within it the intention of locating moments of symmetry in the relationships between numbers, and such an intention would later be translated into making sense of microscopic and invisible worlds operating at scales outside our everyday consciousness, which is where I am headed in my next few examples.

**The Lightcone of Time**

The connection between space and time was documented even before the onset of Einstein’s Special Relativity, for the seventeenth-century French philosopher and mathematician René Descartes and his contemporaries, and later Isaac Newton, were interested in the position of substance in our three-dimensional space. The difference is that they did not yet attempt to add time as the unequivocal fourth dimension to three-dimensional space, which kept time absolute and separate from the coordinates of space.

Indeed, many of the geometrical diagrams one finds in Newton’s Principia are concerned with understanding how substance is displaced, and with the absolute and relative position of motion in relation to space, while also attempting to go beyond a purely mechanical interpretation of the dynamics and kinematics of the material objects observed. However, this changed with the discoveries made through the theories of Special and General Relativity, especially with their experimental verification.

The first level of change happened when the three-dimensional Cartesian view of space, via three-dimensional Euclidean geometry, transformed into the Minkowski view of three-dimensional space with time as an imaginary fourth ‘space,’ even if the internal property of time remains different from that of space. The Euclidean geometry of the Cartesian coordinates remains: relativistic space-time is commonly written in what are called Galilean coordinates, in which the three coordinates of space are denoted by superscript indices 1 to 3 over x, while time is given the superscript index 0. Time is now no longer absolute, and the space-time diagram demonstrates its transformative capacity. However, the early years of Special and General Relativity involved unplugging our conception of space and time from the stubborn dominance of the Newtonian framework.

Moreover, the point coordinates of the Cartesian paradigm became ‘world’ lines, and the famous space-time lightcone, which you might have seen in Stephen Hawking’s A Brief History of Time, was born. Long before Hawking, however, Hermann Weyl, a German mathematical physicist and friend of Einstein, diagrammed an early, crude representation of the lightcone in his book Space-Time-Matter, first published in English (translated by Henry L. Brose) in 1922 (the German version was published about four years earlier, not long after Einstein published his world-changing treatise); more sophisticated representations have since developed in current physics textbooks.

There are three classes of curves in the light cone diagram, determined by the character of the geodesic vector. A worldline with the timelike property represents the movement of ‘massive’ objects below the speed of light; an object that traverses a lightlike curve would have to travel at the speed of light; and an object traversing a spacelike curve would have to go beyond the speed of light.

The lightcone represents the mathematics of the Minkowski metric and tensor algebra, as well as that of the Lorentz transformation, which relates two reference frames moving relative to each other at constant velocity. What looks like an intuitive depiction of the possibility of ‘time-travel’ actually contains complicated maneuvering of constants and variables with superscript and subscript indices, as well as of manifold geometry.
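As a rough illustration of the mathematics the lightcone encodes, here is a short sketch of my own (not from the post), assuming units with c = 1 and the (−, +, +, +) sign convention; the function names are hypothetical. It classifies a space-time separation as timelike, lightlike, or spacelike, and checks that the interval is left unchanged by a Lorentz boost between two frames in constant relative motion:

```python
import math

def interval_squared(dt, dx, dy, dz):
    """Invariant interval s^2 = -dt^2 + dx^2 + dy^2 + dz^2 (c = 1)."""
    return -dt**2 + dx**2 + dy**2 + dz**2

def classify(dt, dx, dy, dz):
    """Label a separation by its position relative to the lightcone."""
    s2 = interval_squared(dt, dx, dy, dz)
    if s2 < 0:
        return "timelike"    # inside the cone: slower than light
    if s2 == 0:
        return "lightlike"   # on the cone: exactly the speed of light
    return "spacelike"       # outside the cone: would require v > c

def boost(dt, dx, v):
    """Lorentz boost along x with relative velocity v (|v| < 1)."""
    gamma = 1.0 / math.sqrt(1 - v**2)
    return gamma * (dt - v * dx), gamma * (dx - v * dt)

print(classify(2, 1, 0, 0))   # timelike
print(classify(1, 1, 0, 0))   # lightlike
print(classify(1, 2, 0, 0))   # spacelike

# The interval is invariant under the boost, as relativity requires:
bt, bx = boost(2, 1, 0.5)
print(abs(interval_squared(bt, bx, 0, 0) - interval_squared(2, 1, 0, 0)) < 1e-9)  # True
```

The boost mixes the time and space coordinates into each other, which is precisely why time can no longer be treated as absolute; only the interval, and hence the timelike/lightlike/spacelike character of a separation, is frame-independent.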

**Footsteps in the Chamber**

The developments in relativity coincided with developments in radiation science that gave way to nuclear physics, atomic theory, and physical chemistry. From the late 1910s up to the 1930s there was also the development of quantum mechanics, which came about because of the inability of Newtonian classical mechanics to account for certain observed phenomena at the sub-atomic level, and which was later to inform all these other fields.

Work in theory continued in tandem with experiment. One of the most exciting developments to come out of using quantum mechanics to explain observed physical phenomena in experiments was the work relating to cosmic rays, and the traces they left behind in a succession of ‘chambers’: the emulsion plates, the cloud chamber, the bubble chamber, and later, the increasingly sophisticated and complex particle detectors.

Peter Galison, in Image and Logic (University of Chicago Press, 1997), has written an exhaustive tome on this history of the image, beginning from early work in radioactive science up to the point when the science of particles was no longer performed manually by human scientists, but by their computerized ‘sidekicks.’

Despite the increased level of mediation between scientists and the microphysical objects of their work, the form of the images produced has not altered drastically, even as their presentation went from tracks on analog photographic plates to vertices on computer screens. What is interesting here, instead, is the sense that the tracks left behind in the chambers and emulsion plates were what-you-see-is-what-you-get, while in today’s accelerators one can only see them as digitally reconstituted images.

Hence, we have naturally occurring ‘diagrams’ of our physical world, as well as ones that have been re-interpreted for us. Many of the traces and ‘strange’ manifestations of these tracks were the ones that pointed physicists to the possibility of an unknown and yet-to-be-predicted entity. The discovery of the neutrino comes to mind when we think about the Wilson cloud chamber’s use in tracking the gamma rays produced through inverse Compton scattering off electrons in the laboratory, or through other natural processes involving electromagnetic radiation.

The detective work performed by studying the lines of flight of these tracks was instrumental to the discovery of increasingly fundamental particles, beyond the electron and the nucleus it orbits. As the tracks represent the branching and decay of atomic nuclei, they also represent the different lifetimes and constitutions of the atoms. Developments in this area through the fifties would coincide with developments in field theory (also known as quantum field theory, a relativistic version of ordinary quantum mechanics). The Feynman diagram thus became the tool for turning the extremely complicated field-theoretical mathematics developed by other physicists into more intuitive forms of computation, by embedding some of the more complicated mathematical machinery in the vertices of the diagram. This brings us neatly to the next section.

**The story of infinities: particles and fields**

Infinities have been considered problematic by mathematicians because that is where everything breaks down and the logic we are familiar with ceases to make sense. The same is true for particle physicists, who have to deal with the problem of infinities in quantum field theory. Hence, what is known as the Feynman diagram was invented to deal with the problem of infinite divergences in the mathematics of field theory. The diagram also became a tool for bridging the physical realities and complexity of experiments with the approximations provided by the rules for constructing the diagrams, which I will not go into here.

The Feynman diagram serves as a catalyst for understanding the calculations, processes, and microstate interactions of the physical building blocks of nature through space and time, while also providing a visual actualization of ‘virtual’ and ‘real’ particles. The ‘virtual’ particles are added as a solution to the problem of conserving the total energy carried by the particles that enter and then emerge from an interaction. The external lines of the diagram represent the ‘real’ particles, while the internal lines represent the ‘virtual’ particles. One could also, technically speaking, ‘go back in time’ through the diagram.

The diagram has influenced the development of modern QED (quantum electrodynamics) and, later, of the electroweak theory, which stems from the unification of the weak force (a force with very short effective range) with the electromagnetic force (itself the product of James Clerk Maxwell’s unification of electricity and magnetism). The diagram mediates alternative formulations of classical fields that could not be dealt with quantum mechanically, offering a way around the problem of infinite energies when the perturbative method of quantum field theory (in which a complicated regime of particle interaction is approximated by a scaled-down version for easier calculation) fails to work.

Feynman envisioned his diagrams through his attention to, and preference for, thinking in terms of particles instead of fields, though later work on the diagram by his younger contemporary Freeman Dyson fused the particle way of thinking with that of the field, making the diagram one of the most common tools for computation in particle physics. Most importantly, the use of the Feynman diagram supports a procedure known as renormalization, which enables the infinities to be ‘subtracted’ out, mathematically speaking.

**Picturing the Unseen Extra-Terrestrial**

It is probably easier to think about imaging and the diagram from the viewpoint of cosmology, astronomy, and astrophysics than from any of the micro-physical examples I have just discussed at length. After all, astronomers have been imaging and diagramming the constellations for centuries, as the introduction of this post suggested, and such diagramming arose not only from curiosity about the forces of nature, but also out of day-to-day necessity.

There are very many examples of the diagram one could pick when looking at space science. Instead, I would like to focus on how diagramming can take place when the objects one wants to observe are invisible, such as the intergalactic medium (IGM) and dark matter. One might argue that diagramming what cannot be readily observed takes on a particular importance, so that all the observable clues can add up to a profile of the invisible entity. This means diagramming the electromagnetic spectrum that is a phenomenal effect of particular interactions of the objects, the diffusion of background radiation or the density of the space purportedly occupied by the object, and the behavior of visible entities when they are around the invisible material, and graphing the signals in correlation to the effective range of the latter’s emergence.

Some such attempts at diagramming the invisible have been made with the aid of telescopes such as the Large Area Telescope aboard the Fermi Gamma-ray Space Telescope, built to detect signals from invisible particles by diagramming the range of energetic spectra emitted and collating these spectra for analysis. The same has also been done with the Cherenkov Telescope Array, which marshals the same kind of detector used in galactic neutrino- and proton-related observations to look at non-thermal high-energy particles that could be a window onto these invisible entities, and thus a view into the genesis of our universe.

**Images:** Tablet; da Vinci; lightcone; cloud chamber; vertices; Feynman diagram; dark matter
