The mechanism was found in 1902 – exactly 115 years ago – and the discovery changed the way we saw the ancient Greeks. The man who found it, Valerios Stais, initially thought that the bronze object he could see was just a gear or a wheel – but on closer inspection, the metal turned out to be part of the Antikythera mechanism, an ancient and mysterious analogue astronomical computer.
The machine could be used to track and predict the positions of the planets, anticipate lunar and solar eclipses, and count down to the next Olympic Games. But its uses may have been broader still – it might also have served forms of mapping and navigation now unknown to us.
It did all that with more than 30 sophisticated gears. That machinery is hidden inside a case the size of a shoebox, and has been revealed using computer models and special scanning techniques.
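The mechanism's power came from the way meshed gears multiply their tooth ratios. A minimal sketch in Python, using tooth counts reported in published reconstructions of the mechanism's lunar gear train (the exact counts are still debated, so treat them as illustrative), shows how such a train can encode an astronomical ratio:

```python
from fractions import Fraction

def gear_train_ratio(pairs):
    """Multiply the speed ratios of a chain of meshed gear pairs,
    given as (driver_teeth, driven_teeth) tuples."""
    ratio = Fraction(1)
    for driver, driven in pairs:
        ratio *= Fraction(driver, driven)
    return ratio

# Tooth counts as reported in reconstructions of the lunar train.
lunar_train = [(64, 38), (48, 24), (127, 32)]
r = gear_train_ratio(lunar_train)
print(r)         # → 254/19
print(float(r))  # ≈ 13.368 sidereal months per solar year
```

One turn of the input produces 254/19 turns of the output – the ratio of sidereal months to years – which is how a box of bronze gears could "know" the motion of the Moon.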
The mechanism is thought to have been created around 150BC, and was more intricate than any device known for well over a millennium afterwards. Its great age and spectacular workings give it a good claim to being the first computer – but is that really true?
But first – what are computers?
The first known use of the word computer was in 1613, when it described a person who did the kind of work that electronic computers do today. That meaning began to shift during the 19th century, but clung on right into the 20th – the film Hidden Figures shows human computers at Nasa in the process of having their work taken over by electronic ones.
As such, the word "computer" has always been fairly broad, encompassing anything that can do calculations and solve problems. There's no strict definition – it's impossible to say exactly what makes something a computer – meaning that a lot of different things can claim to be the first one.
What were the very first computers?
In the broadest sense, computers have existed for millennia – long before the Antikythera mechanism was made. They might have been around for as long as we've had the capacity to count and calculate.
A notable example is the Ishango bone – the leg bone of a baboon that has a good claim to being a predecessor of the computer, if not the first. The bone has a series of notches carved into it, and though it's not known what those notches represent, it has often been suggested that they were a way of counting or of tracking a calendar.
From there, people used a wide variety of tools to aid their calculations, often based on a similar idea. The abacus is thought to have first been used around 2400 BC, and helped people count everything from money to animals – and some of the principles behind what we know as the Roman abacus still inform the high-powered computers we use today.
So what are analogue computers, like the Antikythera mechanism?
The first analogue computers began to appear relatively soon after that. Such computers rely on physical mechanisms rather than the digital, on-or-off systems that we mostly rely on today – but they use much of the same logic to help people work things out.
The first of those analogue computers is said to have been the Chinese "south-pointing chariot", which emerged in the first millennium BC. As its name suggests, it carried a pointer that was meant to face south at all times. It did so not by using magnets but with gears: so long as the pointer faced south at the beginning of the journey, a system of differential gearing offset any turning of the chariot so that the pointer on top kept facing the same direction.
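The chariot's trick can be captured in a few lines. The sketch below is purely illustrative – the track width and the wheel-travel bookkeeping are assumptions, not a reconstruction – but it shows the underlying idea: the difference in how far the two wheels travel measures the chariot's turn, and the gearing rotates the pointer by the opposite amount:

```python
import math

TRACK = 1.5  # metres between the two wheels (assumed, for illustration)

def pointer_offset(wheel_steps):
    """Given a list of (left, right) wheel travel increments in metres,
    return the pointer's final angle relative to the chariot body,
    in degrees. The differential turns the difference in wheel travel
    into a rotation that exactly cancels the chariot's own turning."""
    offset = 0.0
    for left, right in wheel_steps:
        # How far the chariot body has turned over this increment.
        chariot_turn = math.degrees((right - left) / TRACK)
        # The gearing counter-rotates the pointer by the same amount.
        offset -= chariot_turn
    return offset % 360

# A single step in which the right wheel travels far enough to turn
# the chariot 90 degrees left: the pointer swings 90 degrees right
# relative to the body, so in the ground frame it hasn't moved at all.
print(pointer_offset([(0.0, TRACK * math.pi / 2)]))  # → 270.0
```

Because the pointer's offset always equals minus the chariot's accumulated turn, its heading over the ground never changes – the same cancellation the gears performed mechanically.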
Then came the Antikythera mechanism, which is commonly called the first example of such analogue computers. It used its complex gearing to calculate a range of different things – and with a sophistication that wouldn't be seen again for centuries. As such, it is a worthy holder of the title of first computer – even if it doesn't fit many modern definitions.
How did we get from the Antikythera to today's computers?
Development of such computers continued, much of it focused on mapping and understanding the planets. Muslim astronomers in particular provided a range of innovations – such as increasingly sophisticated astrolabes – that helped increase the accuracy and complexity of these analogue computers.
Eventually, in the 17th century, many of those same principles helped people create the first mechanical calculators. They were much larger than today's calculators – and mechanical rather than electronic – but allowed people to do much the same things: adding and subtracting numbers at first, with multiplication and division coming later.
When was the first 'real' computer made?
The Antikythera mechanism might take the title of first analogue computer – and has often been referred to as the first computer outright. But it doesn't perform the kind of operations that we'd expect of a computer today, and it doesn't work by similar processes either.
The first computer that can really take the name is commonly thought to be Charles Babbage's difference engine and, more importantly, the analytical engine he designed to follow it. The analytical engine was the first design for a general-purpose computing device: it wasn't made to calculate just one thing, but included a range of different components that meant a skilled operator could work out many different kinds of information.
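The difference engine itself was built around a single trick: tabulating the values of a polynomial using nothing but repeated addition, via the method of finite differences. A short Python sketch of that method (the function name and interface are illustrative, not Babbage's):

```python
def tabulate(initial_values, count):
    """Extend a table of polynomial values using only addition.
    `initial_values` must be f(0), f(1), ... for a polynomial f,
    with one more value than the polynomial's degree."""
    # Build the difference columns: each column holds the gaps
    # between consecutive entries of the column before it.
    columns = [list(initial_values)]
    while len(columns[-1]) > 1:
        prev = columns[-1]
        columns.append([b - a for a, b in zip(prev, prev[1:])])
    # For a polynomial, the deepest difference is constant, so only
    # the last entry of each column is needed to continue the table.
    state = [col[-1] for col in columns]
    out = list(initial_values)
    for _ in range(count):
        # Cascade each difference into the column above it:
        # this is the "turn the crank" step – pure addition.
        for i in range(len(state) - 2, -1, -1):
            state[i] += state[i + 1]
        out.append(state[0])
    return out

# Seeded with 0², 1², 2², repeated addition produces further squares.
print(tabulate([0, 1, 4], 3))  # → [0, 1, 4, 9, 16, 25]
```

Each new value costs only a handful of additions, no matter how complicated the polynomial – which is exactly why the scheme suited a machine of cranks and carry wheels.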
That potential was spotted by Ada Lovelace, one of the heroes of computing. She realised that Babbage's machine wasn't just helpful for doing calculations – it could manipulate symbols of any kind according to rules. It was in that insight that the algorithm, and the modern idea of computer programming and programmers, was born.
What are the other firsts?
The first programmable computer – perhaps the first modern, functional computer – came in the late 1930s. It was called the Z1 and was built by the German engineer Konrad Zuse in his parents' living room.
The first electronic programmable computer arrived soon after, in the early 1940s. It was called Colossus and was used by British codebreakers to decrypt German messages.
The first electronic digital computer proper arrived at around the same time, with the ABC, or Atanasoff-Berry Computer. This is one of the "first computer" claims that has official backing – in the 1970s, a US judge ruled that the ABC's creator, John Vincent Atanasoff, was officially the inventor of the electronic digital computer.
But the most famous example of an early digital computer is ENIAC. Though it wasn't officially the first digital computer, it was the first fully functional, general-purpose one – and so perhaps wins the more meaningful first.
From there would come a range of other leaps: the first computer that could store programs in its memory, built in Manchester; the first commercial computer, sold in the 1950s; the first to include RAM. Over time, computers would become much smaller and much faster.
Perhaps the next remaining "first" to be claimed is the first properly working quantum computer. For now that remains largely theoretical – but if researchers pull it off, it could vastly increase the world's computing power in one small step.