Fellow Scientific American blogger John Horgan is at it again. This time he is heralding the end of fundamental physics based on the increasing time lag between Nobel Prizes awarded for fundamental discoveries. There's actually a grain of truth in his analysis; for instance, the prizes awarded for quantum mechanics in rapid succession in the twenties and thirties tell us how fast that field was growing, a scenario that's unlikely to repeat itself.
The analysis is also a little deceptive.
To see why, let's imagine the Nobel Prize being established much earlier, in 1700 instead of 1900. Isaac Newton would surely have received the prize for working out the laws of motion and gravitation in his monumental Principia, published in 1687. But what then? There were certainly great scientists like Hooke, Huygens, Boyle and Cavendish in the 17th and 18th centuries, and many of them might have rightly received the prize. Perhaps Coulomb would have received it for his law of electrostatics, formulated in 1785, and Benjamin Franklin might even have received it for demonstrating that lightning is a form of electricity. The prize might have been awarded to Count Rumford (Benjamin Thompson) for his very important discovery of the relationship between mechanical work and heat.
But very few of these scientists made discoveries that were as fundamental as Newton's, and very few of their discoveries are of the stature of Dirac's equation for the electron, special relativity or the uncertainty principle. So if you charted the list of Nobel Prizes between 1700 and 1800 you might feel rather despondent about the future of fundamental physics, as John seems to be feeling right now. In fact it would be fair to say that a discovery as fundamental as Newton's came more than a hundred years later, when Faraday discovered his law of electromagnetic induction. After that, fundamental discoveries seem to have been more frequent; among 19th century developments we would undoubtedly include Maxwell's formulation of the equations of electromagnetism as well as the working out of the laws of thermodynamics by Clausius, Joule, Kirchhoff and others. All of these discoveries were more than Nobel-worthy. But the point is that whether you see a correlation between increasing time lags for Nobel Prizes and a decline in fundamental discoveries depends as much on the time period you choose to analyze as it does on other factors.
When you extend the analysis to greater time periods you realize that the history of science often consists of relatively fallow periods punctuated by upheavals. This was in fact the view propagated by Thomas Kuhn when he wrote about "normal science" and "paradigm shifts". Now let me be clear that I actually agree with John that discoveries of the kind made during the heyday of nuclear physics or quantum mechanics are indeed singular; you can only discover the atomic nucleus once. But I also think that correlating the frequency of Nobel Prizes with the potential for fundamental discoveries in physics can be an endeavor fraught with temporal complexity. It is as conceivable to imagine that we are now in a fallow period and at the cusp of an upheaval as it is to imagine that we are running out of fundamental things to discover.
The other reason why people like me breathe a sigh of resignation when we read articles like these is that our reaction inevitably is, "So what?". The fact is that the majority of the world's scientists don't work on fundamental laws, and those of us who work on the application of those laws are as happy working out their consequences as their progenitors were in discovering them. The illusion that almost every physicist must be obsessed with fundamental laws is one that has been created by a hype-happy media and a rash of biased popular science books on topics like quantum mechanics and cosmology. In addition, what counts as "fundamental" depends on the field of science. For a chemist, a theory of (emergent) chemical bonding is as fundamental as the theory of quantum electrodynamics would be to a physicist. For an ecologist, a theory of the rise and fall of predator and prey populations is as fundamental as Maxwell's equations are to an electrodynamics expert. Neuroscience seems to be on the cusp of discovering a lot of fundamental principles in its own domains. What's applied science to one kind of scientist may well be fundamental to another kind, so the boundaries between the two are never sharply drawn.
One thing that we should not forget is that even if we stopped discovering fundamental laws altogether, 99% of the world's scientists would still go on happily plying their trade without batting an eyelid. It's not that we don't care about fundamental laws or don't appreciate their significance and beauty - we very much do - it's just that for us the word "science" encompasses so much more than the quest for universal principles. Discovering a drug for cancer, working out human ancestry, building an artificial leaf to harvest solar energy, building the Giant Magellan Telescope, watching non-Newtonian fluids flow, even communicating science to the public - all these activities are as fascinating, important and gratifying as unifying quantum mechanics with gravity. So even if we discover fewer and fewer fundamental laws, science will continue to offer more and more to beguile and intrigue our minds. Fundamental or not, it will always be a source of progress and intellectual stimulation. What more could we want?