A to-scale representation of the Astronomical Units of distance from the Sun to Saturn. The distance to Earth’s orbit is 1AU
Space is big, really big, as the opening of The Hitchhiker’s Guide to the Galaxy sagely informs us.
The distances between celestial objects are so mindbogglingly vast that specialist units are needed to chart them.
Expressing the distance from Earth to the edge of the observable Universe in miles, for example, results in the unwieldy figure of 270,000,000,000,000,000,000,000 (give or take).
Even using mathematical notation to shorten it to 2.7×10²³, it’s still so esoteric as to be near meaningless.
What space needs is really, really big units of measurement.
Astronomical Units, or AU, are the smallest, simplest and oldest of these astral measurements.
One AU is equal to the radius of Earth’s orbit around the Sun; or, to be more precise, the average radius since Earth’s orbit is elliptical.
An AU is defined as 149,597,870,700m (about 93 million miles), a value officially set by the International Astronomical Union (IAU) in 2012.
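Since the AU is now fixed in metres, the familiar “93 million miles” follows by straight conversion. A minimal sketch in Python (assuming the exact international mile of 1,609.344m):

```python
# The IAU's 2012 definition of the astronomical unit, in metres.
AU_M = 149_597_870_700

# Convert to miles using the exact international mile (1 mile = 1,609.344 m).
au_miles = AU_M / 1_609.344
print(f"1 AU = {au_miles:,.0f} miles")  # just under 93 million miles
```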
Astronomers have been trying to calculate the distance from Earth to the Sun ever since, in the 3rd century BC, Archimedes estimated it to be around 10,000 times Earth’s radius, or 63,710,000km – so he was nearly halfway there.
Not bad for someone who lived 2,000 years before the telescope was invented.
It wasn’t until 1695 that Christiaan Huygens made the first close guess of 24,000 Earth radii (152,904,000km). Some science historians, though, dismiss his calculation as more luck than judgement, preferring to cite Jean Richer and Giovanni Domenico Cassini’s rigorously calculated 22,000 Earth radii (140,162,000km) as the first scientifically plausible estimate (despite the fact that they were further from the mark than Huygens).
A and B show how a nearby star appears to move against its background when Earth is at different positions; C is equal to an AU; D is a parallax angle of one arcsecond; E is a parsec
Lightyears and parsecs
The AU remains a useful unit for distances within the Solar System.
But the Solar System is a tiny corner of the Universe and much bigger units are needed once we go beyond it.
A lightyear is defined as how far a beam of light travels in one year – around 9.5 trillion km.
If you want to be precise, the IAU regards a year as 365.25 days, making a lightyear 9,460,730,472,580,800m.
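That 16-digit figure falls straight out of the definition. A quick check in Python, using the exact defined value of the speed of light:

```python
C = 299_792_458              # speed of light in m/s (exact, by definition)
JULIAN_YEAR_S = 31_557_600   # the IAU's 365.25-day year, in seconds (365.25 * 86,400)

# One lightyear is simply speed multiplied by time.
lightyear_m = C * JULIAN_YEAR_S
print(f"1 lightyear = {lightyear_m:,} m")  # prints 9,460,730,472,580,800 m
```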
The germ of the concept originated with Friedrich Bessel, who in 1838 made the first successful measurement of the distance to a star outside our Solar System, 61 Cygni.
In his findings he mentioned that light takes 10.3 years to travel from 61 Cygni to Earth.
He wasn’t seriously positing the idea of lightyears as a unit; for one thing, the speed of light had yet to be measured accurately at the time.
However, the concept was too enticing to ignore, and by the end of the 19th century it was in general use, even if some astronomers ever since – including Arthur Eddington, who called it irrelevant – have been sniffy about it.
So why is the lightyear useful?
Take our nearest extrasolar star, Proxima Centauri.
Instead of expressing its distance in miles (around 25,000,000,000,000) or AU (roughly 268,800) – values too vast to grasp meaningfully – we can say it’s 4.25 lightyears away.
Our closest neighbouring galaxy, Andromeda, is over two million lightyears away.
So the lightyear is like metrology by metaphor, similar to “areas of rainforest the size of Wales”.
But while the general public embraces the lightyear because it’s such an easy concept to get your mind around, scientists, of course, prefer a unit that needs a diagram to explain.
Officially, a parsec is the distance at which one astronomical unit subtends an angle of one arcsecond, which would leave most people going, “Huh?”
It’s not quite as arcane as it sounds.
The parsec is based on parallax vision.
For a practical example, hold your finger in front of your eyes, then alternate closing each eye; the finger appears to leap from side to side in relation to the background.
Now imagine this on a cosmic scale.
If Earth is on one side of the Sun, when we look at a nearby star, it will appear to be in one position in respect to the stars in the background.
Six months later, when Earth is on the extreme other side of the Sun, that same star will appear to be in a slightly different position against its background.
We’re talking tiny amounts of difference, measured in arcseconds (of which there are 3,600 in one degree of sky).
A parsec is the distance to a star that would appear to move by two arcseconds over that six-month period; or, to put it another way, by one arcsecond as Earth travels a baseline of 1AU.
Hence the name: PARallax, arcSECond.
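This definition gives astronomers a wonderfully simple rule: distance in parsecs is just one divided by the parallax in arcseconds. A sketch in Python (Proxima Centauri’s parallax of roughly 0.7685 arcseconds is assumed here for illustration):

```python
def parallax_to_parsecs(parallax_arcsec: float) -> float:
    """Distance in parsecs is the reciprocal of the parallax angle in arcseconds."""
    return 1.0 / parallax_arcsec

# Proxima Centauri's parallax is roughly 0.7685 arcseconds (illustrative value).
d_pc = parallax_to_parsecs(0.7685)
d_ly = d_pc * 3.2616  # roughly 3.26 lightyears per parsec
print(f"Proxima Centauri: {d_pc:.2f} pc, about {d_ly:.2f} lightyears")
```

The smaller the parallax, the further the star – which is why the nearest star to us still shifts by less than an arcsecond.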
The term first appeared in a 1913 paper by English astronomer Frank Dyson.
A parsec is roughly 30 trillion km, or a little over three lightyears.
Returning to our previous examples, this places Proxima Centauri 1.3 parsecs away from us, and the Andromeda Galaxy nearly 800 kiloparsecs.
Hang on – kiloparsecs? Yes, even parsecs aren’t huge enough for some scales, so they’re upscaled to kiloparsecs, megaparsecs and gigaparsecs (one thousand, one million and one billion parsecs respectively).
Which means we can now inform you that the edge of the visible Universe is 14 gigaparsecs away without wearing out the zero key on our keyboard.
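As a sanity check, those 14 gigaparsecs should square with the 2.7×10²³ miles quoted earlier. A rough Python sketch (the miles-per-lightyear figure is an approximation):

```python
LY_PER_PARSEC = 3.2616        # lightyears per parsec (approximate)
MILES_PER_LY = 5.879e12       # miles light travels in one year (approximate)

# Edge of the visible Universe: 14 gigaparsecs, i.e. 14 billion parsecs.
gpc = 14
miles = gpc * 1e9 * LY_PER_PARSEC * MILES_PER_LY
print(f"{miles:.1e} miles")   # on the order of 2.7e+23 miles
```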
Space versus time: retconning the Millennium Falcon’s finest moment
For many decades, those who understood such things would sigh wearily when pulp sci-fi authors mistook lightyears for a measure of time, rather than distance.
However, in a well-known gaffe in the original Star Wars (1977), George Lucas’s script mistakes a parsec for a measure of time, when Han claims the Millennium Falcon “made the Kessel Run in less than 12 parsecs”.
The recent Solo film (rather unconvincingly) tried to retroactively explain away this discrepancy with some nonsense about shortcuts.
Dave Golder is a science journalist and writer