Science: Ahead of the times

So accurate are atomic clocks, they keep better time than the planets. Dan Falk reports on the leap second that will align the two this New Year
TIME, IT HAS been said, is nature's way of keeping everything from happening at once. That may sound like a frivolous definition, but it's about the best we can do. Time, like space, is a fundamental part of our world. Ask a physicist or a philosopher to define it more precisely and, if they don't immediately change the subject, they'll probably tell you what you already know: time, they will say, is what clocks measure.

Our species may not be ready to show off its deep understanding of the nature of time at the next intergalactic, interdisciplinary symposium, but we could probably give a decent lecture on how to measure it. From the first stick-in-the-ground sundials to the current crop of atomic clocks, the story of time-keeping parallels the story of civilisation itself. Some historians will even argue that our civilisation advanced because we learned how to measure time. Either way, clocks have become ubiquitous. If we don't know the time, we feel a bit lost.

For proof that we take time-keeping seriously, consider the "leap second". At first, it sounds silly to reset a clock by such a trivial amount. At the end of this year, however, the keepers of the world's atomic clocks will do just that. The reason is fairly simple: the Earth is slowing down. If we kept our clocks on solar time - as we did until about 30 years ago - this wouldn't be an issue. But it's now the atom that marks the flow of time. And left unchecked, atomic time would eventually get out of step with our spinning planet. "Atomic clocks are actually considerably more accurate than the Earth itself," explains Jonathan Betts, curator of horology at the Old Royal Observatory in Greenwich. The leap second, he says, "asks the atomic clocks to hold their breath for one second, so that the Earth can catch up". Indeed, without such corrections, the sun would be overhead at midnight rather than noon in about 3,000 years.

Today's atomic clocks measure time by counting oscillations of the caesium atom, which vibrates 9,192,631,770 times per second. That allows the world's best clocks to keep time to about one billionth of a second (one nanosecond) per day - or about one second in 3 million years. Such accuracy may sound obsessive, but much of our electronically-driven society depends on it. Without atomic clocks, navigation - on the ground, at sea and in the air - would be crippled. There would be no global positioning systems, no digital radio, no television transmissions, no precision surveying, no synchronised computer networks and no space probes.
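The figures above are easy to check with a little arithmetic. As a sketch (the constants are taken from the text; the result is only approximate):

```python
# The SI second is defined as a fixed count of caesium-133 oscillations.
CAESIUM_HZ = 9_192_631_770  # oscillations per second

# A clock that drifts one nanosecond per day takes this many days
# to accumulate a full second of error.
drift_per_day = 1e-9  # seconds of error per day
days_to_one_second = 1 / drift_per_day

years = days_to_one_second / 365.25
print(f"about {years / 1e6:.1f} million years per second of error")
```

A nanosecond a day works out to roughly one second in 2.7 million years, consistent with the "about 3 million years" quoted above.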

As accurate as these clocks are, however, the clocks of tomorrow will be even better. Scientists are now experimenting with caesium fountain clocks, in which caesium atoms are cooled to a few millionths of a degree above absolute zero (-273.15C). A cloud of these atoms is shot upward - hence the name "fountain" - and allowed to fall back down, under gravity. At this ultra-low temperature the atoms move very slowly, thus allowing the signal they produce to be measured with great precision. A prototype caesium fountain clock has been running in a French laboratory since 1993; it's been keeping time to about one-fifth of a nanosecond per day.

Another breed of time-keepers, trapped-ion machines, holds even greater potential. In these devices, single ions - atoms which carry an electrical charge - are trapped in a vacuum, where they can absorb and emit radiation at frequencies substantially higher than in conventional atomic clocks. They have the potential, researchers say, to keep time to within a trillionth of a second per day. If such a timepiece had been set up 3 billion years ago, when the first living cells were evolving on Earth, it would have gained or lost no more than a second since then. So far, however, they're still at the experimental stage. Trapped-ion clocks "are really the clocks of the 21st century," says Joan Furlong, a research scientist at the National Physical Laboratory in Teddington, just west of London. "We are very excited about these new developments."
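The drift rates quoted for the three clock generations can be compared directly. A rough sketch, using the article's figures (the fountain rate of one-fifth of a nanosecond per day and the trapped-ion projection of a trillionth of a second per day):

```python
# Rough drift comparison for the clock generations mentioned in the text.
# Rates are seconds of error per day, taken from the article's figures.
DAYS_PER_YEAR = 365.25

clocks = {
    "caesium beam": 1e-9,      # ~1 nanosecond per day
    "caesium fountain": 2e-10, # ~0.2 nanoseconds per day
    "trapped ion": 1e-12,      # ~1 picosecond per day (projected)
}

years = 3e9  # since the first living cells, per the article
for name, rate in clocks.items():
    drift = rate * years * DAYS_PER_YEAR
    print(f"{name}: ~{drift:,.0f} seconds over 3 billion years")
```

At a picosecond a day, the accumulated error over 3 billion years comes to just over one second, which matches the claim in the text.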

A clock - any clock - is a physical process that repeats in a regular fashion. When our earliest ancestors first became aware of the passing of time, it was probably by observing these cycles in nature: the daily rising and setting of the sun, the monthly cycle of the moon, the annual parade of the seasons. As early as 30,000 years ago, humans may have been carving notches into rocks or bones to mark the passing of the days. One such fragment, discovered in France, could be the world's oldest lunar calendar. By 3500BC, ancient civilisations had learned not only to count days, but to divide them: they noticed that a vertical stick, planted in the ground, casts a shadow whose motion mirrors the sun's daily passage across the sky. The sundial - the first artificial clock - was born.

Within a few thousand years, sundials had become sophisticated - and common - instruments. The day could now be mapped out into hours; the hour into halves and quarters. This didn't please everyone. "The gods confound the man who first found out how to distinguish hours," lamented the Roman playwright Plautus, around 200BC. "Confound him, too, who ... set up a sundial, to cut and hack my day so wretchedly into small pieces!"

It wasn't just the day that the Romans had learned to sub-divide; with the introduction of Julius Caesar's calendar, the year was also tamed. The new calendar had 12 months, with alternating lengths of 30 and 31 days, and a "leap year" to provide an extra day every four years - a calendar only slightly different from the one we use today. The calendar, along with sundials and water clocks, had created artificial cycles that would soon be more important than the natural rhythms on which they were modelled.

According to David Duncan, author of the new book Calendar, this marked an important transition. Before Caesar, Duncan says, people thought of time as a recurring cycle of natural events; for most of them, ideas of "past" and "future" had little meaning. But these advances in time measurement, Duncan writes, "introduced the concept of human beings ordering their own individual lives along a linear progression ... independent of the moon, the seasons, and the gods."

Another thousand years would pass before the first mechanical clocks appeared. The Chinese used water-driven clocks from around 1000AD; in Europe, the first weight-driven clocks - ancestors of all modern mechanical timepieces - were installed on church towers from around 1300. At this stage, they had bells but no "hands". (The word "clock" comes from the Middle English clok, the bell that called monks to prayer.) The clock in Salisbury Cathedral, built in 1386, may be the world's oldest surviving mechanical timepiece.

Those first clocks hardly kept time better than the sundials and water clocks they replaced, gaining or losing as much as half an hour in a day. But the invention of the pendulum clock in the mid-1600s, usually credited to the Dutch astronomer Christiaan Huygens, marked a great improvement. Soon clocks kept time to within a minute or two a week, and by the early 1700s, an accuracy to within a few seconds per day was possible.

Perhaps the greatest demand for precise timekeeping came from ships' captains, who needed accurate measurements of latitude and longitude. It was that quest, in part, that led to the founding of the Royal Observatory. Charles II appointed John Flamsteed as the first Astronomer Royal in 1675, charging him "to apply himself with the most exact care and diligence to rectifying the tables of the motions of the heavens, and the places of the fixed stars ... for the perfecting of the art of navigation."

Latitude can be read off a sextant (a navigational instrument incorporating a telescope for measuring angular distances), but determining longitude was more difficult; it required knowing the time difference between the ship and its home port. In 1714, the government offered a prize of £20,000 (equivalent today to £500,000) to anyone who could build a clock accurate enough to measure longitude to within half a degree - a small price to pay for the technology that would secure the nation's commercial and military supremacy. The prize was claimed by clockmaker John Harrison for the chronometer he built in 1760.

The minute was a product of the Industrial Revolution. The factory whistle and the steam locomotive demanded the division of the hour into minutes, and, eventually, seconds. But railroads introduced a problem: different cities kept different local time, which played havoc with train schedules. The solution came in the 1880s, with the introduction of standardised time zones. The world was divided into 24 "slices" of longitude, each spanning a width of 15 degrees. Clocks in each zone would keep the same time, running exactly one hour ahead of the zone to the west and one hour behind the zone to the east. After a great deal of debate, the international community recognised Greenwich as the starting point of longitude; to this day, the brass line that marks the Prime Meridian draws thousands of visitors to Greenwich every year.
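The ideal scheme of 24 slices of 15 degrees can be sketched in a few lines. This is only the nominal geometry; real political time zones deviate from it considerably:

```python
def nominal_zone_offset(longitude_deg: float) -> int:
    """Nominal offset from Greenwich, in hours, for a given longitude,
    under the ideal scheme of 24 zones each 15 degrees wide.
    Positive longitudes (east of Greenwich) run ahead of Greenwich."""
    return round(longitude_deg / 15)

print(nominal_zone_offset(0))      # Greenwich: 0
print(nominal_zone_offset(-74))    # New York's longitude: -5
print(nominal_zone_offset(139.7))  # Tokyo's longitude: 9
```

Dividing by 15 simply converts degrees of longitude into hours, since the Earth turns through 360 degrees in 24 hours.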

By the 1860s, pocket watches were being mass-produced, and the wristwatch became popular - first with women, until their use by soldiers in the First World War gave them a more masculine image. The wristwatch, historians argue, helped to "democratise" time; once every factory worker had a watch, a line was drawn between the time he owed to his boss, and the time that was his own.

In the 1940s, the invention of the quartz oscillator allowed precision timekeeping beyond the reach of the best mechanical clocks. These in turn were superseded by caesium clocks, such as the ones operated today by Dr Furlong and her colleagues at the NPL. Setting the official time, however, is a task that requires more than a single clock - or even a single laboratory. Instead, 200 clocks in nearly 50 national laboratories send data to the International Bureau of Weights and Measures in Sèvres, outside Paris. The Bureau calculates a kind of average, and feeds that signal back to the participating labs. The result is Coordinated Universal Time, the modern version of GMT.
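The "kind of average" the Bureau computes can be pictured as a weighted mean, with better-behaved clocks counting for more. This is a much-simplified sketch with invented numbers; the Bureau's real algorithm is far more elaborate:

```python
# A simplified picture of forming a timescale from many clocks:
# a weighted average of each clock's offset from a common reference,
# with weights reflecting each clock's past stability.
# (Illustrative numbers only - not real laboratory data.)
readings = [0.0e-9, 3.0e-9, -2.0e-9, 1.0e-9]  # offsets, in seconds
weights  = [0.4, 0.2, 0.2, 0.2]               # stability weights, summing to 1

ensemble_offset = sum(w * r for w, r in zip(weights, readings))
print(f"ensemble offset: {ensemble_offset:.2e} s")
```

Averaging over many independent clocks means that no single machine's wander can drag the official timescale with it.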

In the atomic age, time has been divorced from the heavens. Once, we looked up toward the sun to know the time; now we look down, relying on the output of a machine. But nature may yet have the last word: astronomers have noted that the dense, rapidly-spinning stars known as pulsars emit flashes of light at extremely regular intervals. Pulsars "are almost as good, and perhaps even better as measures of time, than atomic clocks," says Cambridge cosmologist Sir Martin Rees. "So astronomers have, in modern times, discovered a new kind of cosmic clock."

Rees happens to hold the position once held by Flamsteed; he is England's current Astronomer Royal. But he is able to investigate problems that Flamsteed could only have speculated on - like the question of how time began (his new book, appropriately, is called Before the Beginning). Time, he says, was probably born along with the universe itself - created in a colossal explosion, known as the Big Bang, about 15 billion years ago. And, incredible as it sounds, our current physical theories let us probe, with confidence, back to within a fraction of a second of that event. Beyond that, however, things get murky. The conditions during those first fleeting moments - when all the matter and energy of the universe was packed virtually into a single point - demand a better theory than any now available. "As we get nearer to the Big Bang, we have to jettison many of our common-sense ideas," Rees told a lecture audience recently. "We must think carefully about what we mean by time."

With the year 2000 looming on the horizon, many people will indeed be thinking about time - though probably not with quite as much zest as the cosmologists. And, more than likely, they'll be looking forward rather than back. The future, as always, is full of uncertainty, but there are a few things we can be sure of. Our clocks will continue to improve, while the natural cycles they imitate - the spinning Earth, the orbiting moon - will march on. But time itself, no matter how finely we can measure it, will very likely remain an enigma.

A brief history of time

Bone carving in France which suggests early lunar calendar
Solar calendar in use in Egypt
First sundials
Stonehenge is begun
First water clocks
Babylonians adopt seven-day week
Julius Caesar proclaims calendar with 365 days; 366 in leap years (average year is 11 minutes longer than true year)
Water-driven mechanical clocks in use in China
First weight-driven clocks (accuracy: around 30 minutes per day)
First spring-driven clocks
Galileo uses a pendulum to measure time
Pope Gregory XIII reforms calendar (average year is 26 seconds shorter than true year)
Dutch astronomer Christiaan Huygens builds first successful pendulum clock (accuracy: around 10 seconds per day)
Royal Observatory founded
Britain accepts Gregorian calendar
English clockmaker John Harrison builds chronometer accurate to 1/10 second per day, allowing sailors to determine longitude at sea
Big Ben installed in Parliament tower
Standard time zones adopted, with Greenwich Mean Time as reference
Einstein shows time is affected by motion
First quartz-crystal clocks (accuracy: around 1/500 second per day)
First atomic clock
Annual world-wide watch production reaches 100 million
Second is defined as 9,192,631,770 vibrations of a caesium atom
First quartz-crystal watches
Coordinated Universal Time (UTC), based on atomic clocks, runs independently of Earth's rotation; "leap seconds" introduced
First caesium fountain clock
Atomic clocks accurate to one billionth of a second per day