At the height of this age, technology will produce fruits of which we have only dreamt. The mundane decisions of life, together with tedious chores such as manually turning on the television set, sending a letter through the post or heaving home a bag of groceries, will long since have been taken out of our hands by computers that anticipate our every need.
Yet the children of the Seventies who fantasised in their school compositions about "What It Will Be Like in the Year 2000" are still waiting for the advent of personalised travel capsules and know-it-all robots which once seemed sure to follow in the footsteps of R2D2. Are children of the Nineties just as misguided in believing that such inventions of the mind are likely to become reality?
Charles Jonscher, who trained in electrical sciences at Cambridge and now runs an investment firm, is refreshingly sceptical of the assumption that computers, in their various incarnations, hold the keys to richer life in the 21st century. In his new book, he insists instead that their role will always be secondary to the human beings who designed and created them.
A more interesting issue, he suggests, is the way that computers have helped to redefine the identity and the culture of the age. He writes: "We certainly do not need to buy into a new philosophy of life, a sort of cyber-ontology in which the meaning of existence has been solved by deciding that we are computers." And he adds: "The computer revolution is a subplot in a bigger revolution: the explosion of human knowledge in all its forms."
With his argument that humans, rather than technology, will always have the upper hand, Jonscher begins a fascinating unravelling of where the "digital age" has sprung from, with all its limitations and possibilities. While lauding the technology which could now record every moment of a human life by means of a tiny bit of silicon implanted in the brain - the apocryphal "soul-catcher" chip - he points out that the human brain itself has 20 billion neurons, capable of 100 trillion connections (a single neuron can connect with 80,000 others). "Comparing a neuron to a single silicon switch?" he asks. "The intelligence of a single-cell organism less evolved than a neuron, such as a paramecium, is such that it can navigate towards food and negotiate obstacles, recognise danger and retreat from it. How does your PC compare?"
There are some illuminating definitions here: knowledge, notes the author, is a state of being, while information, which comes from the root "to inform", is transitive, something to be used fleetingly.
Jonscher also levels the stun-gun at some sacred cows, such as the idea that artificial intelligence could evolve to take over the world in the manner described in Philip Kerr's thriller Gridiron. And after delving into the scientific theories behind the evolution of IT, he goes on to trace its development, with its impact on and creation of multimedia and the Internet, economic progress and the "productivity paradox", and the technologies of tomorrow.
For anyone who has ever asked what the IT revolution is all about, and how it will affect them, this readable and authoritative account, with its occasional dashes of dry humour, will fill some of the gaps.
Best of all, Jonscher never loses sight of his own argument. As he succinctly sums up: "We must not mistake gigabytes for wisdom".