The galaxy NGC 1365, aglow with H-alpha light that traces star-forming zones (Credit: ESO)

They really are.

The universe is apparently well past its prime in terms of making stars, and the new ones being made now across the cosmos will never amount to more than a few percent on top of the number that have already come and gone.

This is the rather disquieting conclusion of a new and significant study of the rate at which stars have been produced through cosmic time.

Sobral and colleagues recently published the results of a series of 'snapshots' of galaxies busily making stars at different epochs, from about 4 billion years ago (around the time of Earth's formation) all the way back to nearly 11 billion years ago. This is no simple task: some of the world's largest and most sensitive telescopes had to be employed.

By observing light at very specific frequencies (corresponding to emission from warm hydrogen atoms - see the note below) the team was able to gauge the actual rate at which new stars are condensing out of thick nebular material in a few thousand galactic systems. This yields some very robust statistics on the global changes in the number of new stars being made as the universe ages.

The main conclusions come in two parts. First, 95% of all the stars we see around us today were formed during the past 11 billion years, and about half of these were formed between roughly 11 and 8 billion years ago in a flurry of activity. But the real shocker is that the rate at which new stars are being produced in galaxies today is barely 3% of the rate 11 billion years ago, and declining. This indicates that unless our universe finds a second wind (which is unlikely) it will only ever manage to produce about 5% more stars than exist at this very moment.
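As a rough sanity check on that last figure, here is a toy calculation of my own (not the study's actual fit): if the cosmic star formation rate has been declining exponentially since its peak, and today's rate is ~3% of the peak rate of ~11 billion years ago, then integrating the decline forward gives the fraction of stars still to come.

```python
import math

# Toy model (my assumption, not the paper's method): star formation
# rate declining exponentially since its peak ~11 Gyr ago.
t_elapsed = 11.0           # Gyr since the peak epoch
rate_now_over_peak = 0.03  # today's rate is ~3% of the peak (from the study)

# e-folding time implied by that decline: exp(-t/tau) = 0.03
tau = t_elapsed / math.log(1.0 / rate_now_over_peak)  # ~3.1 Gyr

# Integrating A*exp(-t/tau) from the peak to now vs. from now to infinity:
made_so_far = 1.0 - rate_now_over_peak  # proportional to A*tau
still_to_come = rate_now_over_peak      # proportional to A*tau

extra_fraction = still_to_come / made_so_far
print(f"tau ~ {tau:.1f} Gyr; future stars ~ {100 * extra_fraction:.0f}% of current total")
```

This crude model lands at a few percent of additional stars - the same ballpark as the ~5% the study arrives at with its real, measured star formation history.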

This is, quite literally, the beginning of the end.

However, despite the provocative title of this post you shouldn't actually expect to see the stars start to disappear from view too soon. The great majority of stars in the universe are less massive than the Sun, and in the Milky Way about 75% of all stars are less than half as massive. Smaller stars last longer - the nuclear fusion of hydrogen proceeds more slowly in their cores, and they also tend to churn their innards more than a star like the Sun does, resulting in some extraordinary fuel efficiency. Indeed, the smallest stars (so-called M-dwarfs) should have trillion-year lifetimes.
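A back-of-the-envelope scaling shows where those trillion-year lifetimes come from. The exponent below is a textbook approximation of my own choosing, not a figure from the article - and it actually understates small-star longevity, since it ignores the convective churning mentioned above.

```python
# Rough main-sequence lifetime scaling: luminosity L ~ M**3.5 and fuel ~ M,
# so lifetime t ~ M / L ~ M**-2.5, normalized to the Sun's ~10 Gyr.
# (A standard approximation, not taken from the article.)
def lifetime_gyr(mass_solar):
    return 10.0 * mass_solar ** -2.5

print(f"1.0 Msun: {lifetime_gyr(1.0):.0f} Gyr")  # ~10 Gyr
print(f"0.5 Msun: {lifetime_gyr(0.5):.0f} Gyr")  # ~57 Gyr
print(f"0.1 Msun: {lifetime_gyr(0.1):.0f} Gyr")  # ~3000 Gyr: trillions of years
```

Even this simple scaling puts a 0.1 solar-mass M-dwarf at a lifetime of trillions of years - hundreds of times the current age of the universe.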

So we live at an interesting time, at the cusp between exuberant excess and a long gentle decline. We also happen to live in a galaxy that still produces a few stars a year - the Milky Way is going to end up contributing nicely to that last 5%. And when the Andromeda galaxy comes lumbering into us in 4 or 5 billion years' time there may be a sudden burst of new star formation as these two beasts merge, and a final sprinkling of new stellar beacons will - for a time - light the cosmos a little more.

[As a small side note, a number of otherwise respectable media outlets - like this - mistakenly reported that the astronomers performing this study had been detecting 'alpha particles emitted by Hydrogen atoms'. This is such a heinous error that I feel compelled to mention it.

First of all, no hydrogen atom in the history of the universe has ever emitted an alpha-particle, since an alpha-particle consists of two protons and two neutrons - in other words, a helium nucleus.

Second, the astronomers were actually just measuring light emitted in the hydrogen-alpha atomic transition, when an electron drops from the third major allowed energy level around a proton to the second, and spits out a reddish photon. These reddish photons are shifted to lower and lower energies or frequencies as they pass through the expanding universe, and for the really distant objects studied in this case they end up in the infrared part of the spectrum. But they're a nifty way to pin down regions in galaxies where stars are forming and the H-alpha photons penetrate through gas and dust quite well, so can escape to eventually be seen by the likes of us.]
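To put numbers on that transition, here is a short sketch using the standard hydrogen energy levels. The redshift value z ≈ 2.2 is my own assumption for light emitted roughly 11 billion years ago, not a figure quoted in the post.

```python
# Hydrogen-alpha: an electron drops from energy level n=3 to n=2.
RYDBERG_EV = 13.6057  # hydrogen ground-state binding energy, eV
HC_EV_NM = 1239.84    # h*c in eV*nm

e_photon = RYDBERG_EV * (1 / 2**2 - 1 / 3**2)  # ~1.89 eV
lam_rest = HC_EV_NM / e_photon                 # ~656 nm: reddish light

# Cosmological expansion stretches the wavelength by (1+z).
# z ~ 2.2 is an assumed redshift for the most distant galaxies studied.
z = 2.2
lam_obs = lam_rest * (1 + z)                   # ~2100 nm: near-infrared
print(f"rest {lam_rest:.0f} nm -> observed {lam_obs / 1000:.1f} micron")
```

So a photon born as red light at ~656 nm arrives here at ~2.1 microns, which is exactly why infrared-sensitive telescopes were needed for the most distant galaxies in the sample.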