The encryption factor

Quantum computing is set to revolutionise the way we work. Trouble is, it could crack any of today's security codes in a fraction of a second, says Charles Arthur
The Independent Online

When bankers and spies begin to worry about advances in computing, the rest of us would do well to take notice. What makes them edgy are the advances being made in "quantum computing", which is, as might be expected from the name, as entangled and confusing a field to understand as the branch of physics on which it is based - quantum mechanics.

But a banker doesn't need to be able to understand quantum physics to know that a computer capable of breaking any of the world's encryption codes as soon as it is turned on could mean serious problems for the bank's financial system. Systems used to transfer funds around the world every day rely on encryption that takes milliseconds to encode, but, in theory, millions of years to crack, by even the most powerful computers. And governments routinely use encryption to pass on secret messages.

Quantum computing threatens that, which is why bankers and governments are taking a particular interest in a field that barely existed a decade ago. Although the physicist Richard Feynman put forward the ideas that are the basis of the subject in 1982, the wider interest took off only after a researcher called Peter Shor demonstrated - theoretically - in 1994 that a computer with enough "quantum bits" could effortlessly crack modern encryption.

Interest in the field has now taken hold across the world. Professor Apostol Vourdas, of Bradford University's computing department, has just won a £62,000 grant to co-ordinate a network of universities and companies including HP and Hitachi in a quantum computing project. He is working with five PhD students. "The computing side is just one aspect of the field," he says. "The whole field of quantum technology is growing, taking in communications, computing and cryptography."

The reason for this burgeoning interest is twofold. First, computers as we know them can't keep getting faster. "Moore's Law", first observed by the Intel co-founder Gordon Moore in 1965, predicts that roughly every 18 months the number of transistors (the building blocks of processors and memory) in any given area of an integrated circuit doubles, making the machine more powerful. But eventually physical reality will intervene; at some point between 2010 and 2020, the transistors will no longer be shrinkable, because the electrons that carry the signals will leak out. Some of that effect is already visible. In the past two years, Intel, IBM and Motorola, the biggest chip-makers, have had problems with the reliability of chips with 65-nanometre parts.

Second, quantum computing and calculation offer a whole new approach to solving problems. That's because quantum computing uses some of the weird aspects of subatomic particles such as electrons and photons to do its calculations. Both electrons and photons can have a quality called "spin", which is either "up" or "down"; in the quantum computing field these are often used as the "0" and "1" of conventional binary calculation.

What marks out quantum computers is that they don't have to proceed through their calculations step by step. Physicists say that any closed quantum system has a "superposition" of all possible states. Remember "Schrödinger's Cat"? This is a thought experiment about a cat trapped in a soundproof box with a cyanide canister that is released if a radioactive particle decays, of which there is a 50-50 chance. Clearly, when you open the box, the cat is either alive or dead. But is it alive or dead before you open the box? Arguably, it is both; by opening the box, you "collapse" its alive-and-dead state into one or the other.

A quantum computer takes this idea to the extreme. The "quantum bits" (qubits) are, in theory, in every possible binary configuration, all at the same time. It's as though your computer were simultaneously doing every calculation you'd ever asked of it, or ever would, or could.
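The idea of a register holding every binary configuration at once can be sketched by writing the amplitudes down directly. This is an illustrative toy in Python (the function name is our own invention, not anyone's real API); describing an n-qubit state takes 2 to the power n numbers, which is exactly why ordinary computers run out of steam simulating one:

```python
import math

def uniform_superposition(n_qubits):
    """One amplitude per binary configuration of n qubits: after an
    equal-mix operation on every qubit, all 2**n amplitudes are equal."""
    dim = 2 ** n_qubits
    return [1 / math.sqrt(dim)] * dim

state = uniform_superposition(3)        # 8 amplitudes for 3 qubits
probabilities = [a * a for a in state]  # squared amplitude = the chance of
                                        # measuring each configuration
```

Note the doubling: 30 qubits would already need more than a billion amplitudes to describe, which hints at the scale of what quantum hardware taps into.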

If you were very determined, you could create a quantum computer using cats in boxes with radioactive particles and cyanide. When the time came to see what the answer was, you'd open the boxes. Let alive be your "1", and dead be your "0", and you'd have the beginnings of a quantum machine. Two problems, though: the RSPCA wouldn't like it, and what calculation would it be answering?

The problem of getting quantum computers to give the correct answers has stumped researchers in recent years. Yes, you can set up lots of "qubits", but getting them to collapse into the answer you want is harder.

The first working quantum computer, which was able to make calculations with two qubits, was demonstrated in 1998. IBM's Almaden Research Center then made the running for the next three years, demonstrating a three-qubit machine in 1999, a five-qubit one in 2001, and then a seven-qubit version - using 10 billion billion molecules, each of seven atoms, to find the factors of the number 15.

How useful is that, you might wonder? But 15 is special: it is the product of two prime numbers, 3 and 5. Most modern cryptography encrypts data by wrapping it in a huge, unique number produced by multiplying two large primes together. While standard computers are good at multiplying, they are not so good at the reverse, called "factoring" (you can see the challenge: try working out the factors of 221).
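That challenge can be made concrete. Below is a minimal trial-division factoriser in Python - a sketch of the naive classical approach, not of Shor's algorithm. The work grows with the square root of the number, so each extra pair of digits multiplies the effort: fine for 221, hopeless for the hundreds-of-digits products used in real cryptography.

```python
def factor(n):
    """Trial division: the straightforward classical way to find a factor.
    Tries every candidate divisor up to the square root of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # no divisor found: n is prime

print(factor(15))   # (3, 5)
print(factor(221))  # (13, 17)
```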

But quantum computers, directed correctly, can crack enormous factorisation challenges astonishingly quickly. Shor's algorithm exploits the superposition: it arranges the quantum states so that the wrong answers interfere with one another and cancel out, leaving a measurement overwhelmingly likely to reveal the correct factors.

Which is where the banks and governments begin to take fright, and to seek newer forms of cryptography as a defence. Happily, the same physics that threatens the old system yields a newer, even more powerful one - much to the relief of the financial institutions that rely on secure transactions. Quantum cryptography has already moved from theory into the real world; in April 2004, Bank Austria Creditanstalt used quantum cryptography to make a real bank transfer to the city hall of Vienna.

Quantum cryptography really is unbreakable between two points. It uses a peculiar property of quantum particles: if they are generated together, they will be "entangled". This is rather like identical twins - look at one, and you know what the other looks like. The result is that you can create two identical cryptographic keys. Send one to your recipient, and be certain of two things: their key really will be the same; and, if someone tries to tap your transmission of the key, they will collapse the system, so the recipient won't receive a working key. Tapping a quantum channel stops the key from arriving intact, and the tapper learns nothing useful about its content.
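The tap-detection effect can be illustrated with an ordinary classical simulation. The sketch below models BB84, a prepare-and-measure relative of the entanglement scheme described above (the names and numbers are our own; real systems add error correction and privacy amplification). An eavesdropper who guesses the measurement basis wrong gets a random bit and disturbs the photon, so errors show up when sender and receiver compare a sample of their key:

```python
import random

def bb84_error_rate(n_rounds, eavesdrop, seed=1):
    """Toy BB84: the sender encodes each bit in a random basis; the
    receiver measures in a random basis, and both keep only the rounds
    where their bases matched ("sifting")."""
    rng = random.Random(seed)
    errors = kept = 0
    for _ in range(n_rounds):
        alice_bit = rng.randint(0, 1)
        alice_basis = rng.randint(0, 1)
        bit, basis = alice_bit, alice_basis
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            if eve_basis != basis:      # wrong basis: result is random
                bit = rng.randint(0, 1)
            basis = eve_basis           # photon re-sent in Eve's basis
        bob_basis = rng.randint(0, 1)
        bob_bit = bit if bob_basis == basis else rng.randint(0, 1)
        if bob_basis == alice_basis:    # keep only matched-basis rounds
            kept += 1
            errors += (bob_bit != alice_bit)
    return errors / kept

print(bb84_error_rate(2000, eavesdrop=False))  # 0.0 - untapped channel
```

With `eavesdrop=True` the error rate sits near 25 per cent, which is how the tap gives itself away.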

Yet some might say that quantum computing is not making the progress that was expected of it. In 2000, Brian Oakley, who chaired a European study group into the possibilities of such machines, said: "I wouldn't be surprised to see applications [of quantum computing] in five years' time." Well, 2005 has dawned, but the machines haven't.

Vourdas, however, is confident that their time will come - eventually. "In the 1940s, electronic computers were thought to be an intellectual exercise that wouldn't really be made in practice. Yet 30 years later, you could buy them in shops. Quantum computing isn't an outgrowth of classical computing, but a revolutionary step forward." The first transistors, made at AT&T's Bell Labs in 1947, were huge. Now they're so tiny that you're probably surrounded by hundreds of millions at any given moment.

Vourdas adds: "Remember that, 40 years ago, computers needed huge rooms. Presently, a quantum computer might need to be cooled to temperatures close to absolute zero [to control the photons] and a huge space. But that will change. It will go mainstream."