Network: Alan Turing and his oracle machine
Hypercomputing might be just around the corner
Monday 29 March 1999
Aside from being a principal contributor to cracking the German Enigma code during the Second World War, Turing also single-handedly formulated theories upon which much of modern computer science is built. He anticipated precepts of artificial intelligence decades before it became a field of academic pursuit.
Turing postulated a machine, now referred to as the Turing machine, a simple, hypothetical device that could read and be guided by marks on a length of paper tape. Working with paper and pencil, Turing used his creation to form theories that stand to this day.
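Turing's tape-and-marks device is concrete enough to sketch in a few lines of modern code. The sketch below is an illustration, not Turing's own notation: a table of rules maps a machine state and the symbol under the read head to a symbol to write, a direction to move, and a new state. This particular rule table simply inverts every mark on its tape and stops at the first blank.

```python
# A minimal Turing machine sketch: a rule table maps
# (state, symbol) -> (symbol to write, move direction, next state).
def run_turing_machine(rules, tape, state="start", blank="_", steps=10_000):
    tape = dict(enumerate(tape))   # sparse tape, indexed by cell position
    pos = 0
    for _ in range(steps):         # step limit guards against endless runs
        if state == "halt":
            break
        symbol = tape.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Rules that flip every mark, then halt on reaching a blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
run_turing_machine(flip, "1011")   # -> "0100"
```

Crude as it looks, machines of exactly this kind can, given enough tape and time, compute anything a modern supercomputer can.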
He showed that there was no absolute way to protect computers from crashing bugs. It's not that the Microsofts and IBMs of this world don't want to stop crashes, it's that they can't. Turing showed that it's impossible to predict whether a given program will quit or run forever.
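Turing's argument can be paraphrased as a short thought experiment in code. Everything here is hypothetical: `halts` stands in for a supposed universal crash-predictor, and the "troublemaker" program is built to do the opposite of whatever that predictor says about it.

```python
def halts(program, data):
    # A supposed universal decider. Any fixed answer serves the
    # thought experiment; say it always predicts "it terminates".
    return True

def troublemaker_behaviour(prediction):
    # The troublemaker asks the decider about itself, then does the
    # opposite: loop forever if told "halts", stop if told "loops".
    return "loops forever" if prediction else "halts"

prediction = halts("troublemaker", "troublemaker")
actual = troublemaker_behaviour(prediction)

# Whatever halts() answers about the troublemaker, the troublemaker
# does the opposite -- so no correct, fully general halts() can exist.
assert (actual == "halts") != prediction
```

The contradiction holds no matter which answer `halts` gives, which is the heart of Turing's proof.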
Given enough tape and enough time, Turing's namesake machine could do, and could not do, exactly what modern supercomputers can and cannot do. And, while this is all remarkable stuff, Turing, who died at age 41, left unpublished and unheralded work that is even more fascinating.
One of the most fascinating of Turing's lesser-known constructs is the "o-machine". An o-machine is a Turing machine equipped with a black box that could compute the uncomputable. Positing the o (for oracle) box, Turing could show that, theoretically, o-machines could solve the unsolvable problems that Turing machines could not.
In short, if o-machines are possible, then computing, and the world with it, could quickly be turned on its head.
A computer program consists of a long string of ones and zeroes, and can be viewed as a number. Any string of ones and zeroes is a binary number, and can be represented as an integer.
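The idea is easy to demonstrate; the encoding below is arbitrary, chosen only to show the principle that a program's text is just one long binary number.

```python
# Any program is a string of bits, and any string of bits is an integer.
program = b"print('hello')"
as_integer = int.from_bytes(program, "big")   # the program as one number
as_bits = bin(as_integer)[2:]                 # the same number in binary

# Round trip: the integer alone is enough to recover the program.
recovered = as_integer.to_bytes((as_integer.bit_length() + 7) // 8, "big")
assert recovered == program
```

Nothing is lost in the translation: the number and the program are the same object written two ways.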
Now, imagine a memory device of some type that is storing a certain special number, written in binary. This number is irrational, that is to say, its digits run on forever without repeating. It might begin 01010010110. The crucial property is that each individual digit records whether one particular program will terminate or not.
So, if a program were represented by the number 7,687,684,959, one would only have to look at the 7,687,684,959th digit of the special number, and the problem would be solved. A zero might stand for "does not terminate", and a one might mean "terminates".
The "oracle" would simply be the device that performs this task of finding the proper digit. Obviously, if the special number doesn't exist, the oracle wouldn't be able to do its job, and it's far from clear if such a number does or could exist.
It should be noted that Turing never specified the workings of the oracle box, but neither did he ever specify the workings of the Turing machine. He concentrated on the theoretical plane, leaving later generations to work out the details. In fact, many software Turing machines have been created, including Java applets that run in browsers. The oracle comparator would be a relatively simple software construct.
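The comparator part really is simple to sketch. In this hypothetical Python fragment, the impossible part, the infinite special number itself, is faked with a short finite string; only the digit lookup is real.

```python
def oracle(special_number_digits, program_number):
    # Look up the digit whose position equals the program's number:
    # "1" means the program terminates, "0" means it never does.
    digit = special_number_digits[program_number - 1]  # positions count from 1
    return "terminates" if digit == "1" else "does not terminate"

# The real number would be infinite (and may not exist at all);
# a made-up finite prefix stands in for it here.
fake_special_number = "01010010110"
oracle(fake_special_number, 7)   # -> "terminates"
```

All the difficulty, of course, lives in the number itself, not in the lookup.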
That computers are ubiquitous shows that the details of building practical Turing machines were relatively straightforward. One begins to wonder if the o machine might be an attainable goal as well.
O-machines could perform what is now called hypercomputing, tasks that go far beyond the capabilities of extant computers. That the human brain routinely goes far beyond the capabilities of even the most advanced computers is, in the minds of some, an indication that o-machines and hypercomputing are possible.
Modern computers, no matter how powerful, have great difficulty with tasks, like recognising a face, that even a small child can do easily. One could take the view that the human brain, given its great power, is a naturally occurring instance of an o-machine, and that Turing's special number must therefore exist. If that's all true, then, as with the Turing machine, the only trick remaining is to work out the details.
Hypercomputers are likely to be very difficult to create. We might assume that they're thousands of years in the future, except that we already know that things we expect to take centuries often happen in decades or less. Jules Verne predicted television at the end of the 19th century, saying it would take a thousand years to perfect. It actually took less than 50, and many observers think the pace of invention is much faster at the end of the 20th century than it was at the beginning.
So Turing's hypercomputing might be just around the corner. With it might come greatly enhanced abilities to do things like stock market prediction. But even that might pale in contrast to stopping computer crashes.