
This is the dawning of the age of informed bewilderment

The Saturday Essay

Charles Jonscher
Friday 14 May 1999 23:02 BST

In the 1930s, the French philosopher Pierre Teilhard de Chardin predicted the emergence of a noosphere, a network linking mankind at the mental rather than the physical level. Teilhard was a scientist and a Jesuit theologian; he described this noosphere partly in physical terms, as an information network, and partly in spiritual and philosophical language, as a force which would act to unify society. One of the many metaphors which he used to put the concept across was that of a "halo of thinking energy" encircling the planet. Today, the same combination of technical, sociological and philosophical terminology is used to describe the Internet.

Arguably, the technology - that is, the hardware - for Teilhard's vision is about to be with us. But when he spoke of a future noosphere, he did not mean only, or even mainly, the means of physically reaching across the globe. He referred to a further development in the process of human evolution which would lead eventually to the attainment by mankind of a greater unity of mind, body and spirit; there would be a sharing of purposes, ideas and values across societies. When speaking of a future networked society, a global culture or a new phase of civilisation based on the interchange of ideas, we must take care to distinguish between a network in the technical sense, and a network in the human one. Technically, we have wired the planet. At the human level, however, we may or may not be making progress; if we are, it is on a timescale vastly longer than that required to produce successive generations of computer protocols, and the process is much more complex.

There is a yawning gap between data-shuffling, on the one hand, and thinking, on the other. We are certainly digitising practically all forms of recorded information, and then processing them using digital techniques. But that does not mean that the information itself is intrinsically digital or, more importantly, that the operations which we as human beings wish to have performed on it are amenable to digital logic.

The first phase of the computer revolution was the introduction of mainframe computers to replace the armies of clerical workers, who until then had been employed to undertake routine data-processing tasks. Today, a $1,000 notebook with a Pentium chip running Office 98 can look after the housekeeping of an executive's data requirements more effectively than a $2m IBM 360, complete with its software support team. There is as little nostalgia for those big old machines as for the factory-like offices that they replaced.

But in one respect at least, that phase could be regarded as a golden age of computers. Precisely because they were so expensive and difficult to program, they were used for what they were suited to best, and people were suited to least: calculating, sifting and storing, in contexts where, as with a bank account reconciliation, there is not the slightest advantage in deploying the human/analog touch. They were used, in short, for the task of computing.

This was 30 years before digital machines would replace receptionists and answer the telephone, inspiring the Dilbert cartoon of a machine-generated voice saying to customers: "Your call is important to us. Please hold the line while we ignore it." Technology was, by this time, clearly beginning to enter into roles in which a human touch, rather than a computational one, had value. Whether the bank would give you an overdraft became a decision that was handed over to an expert system program which would weigh up your credit-worthiness by algorithm, rather than to a bank officer. The computer was surreptitiously crossing the barrier from what it was ideal for - keeping track of your bank balance and calculating the interest - to what it could encode only by narrowing the problem to fit its own constraints as a computing device.

Our societies are now embarking on the long task of integrating computing machines into daily life. Each time a new software package is released, there is a new opportunity to substitute the quick - and cheap - decision of a machine for a more cumbersome human-generated thought process. Each time this happens there is a gain, direct and easily measured. But there is also a cost incurred - maybe a small one, and greatly outweighed by the benefit, but a cost nevertheless. The person being replaced is not a data-manipulating device or a pre-programmed machine, so it is well to remember that the substitution cannot be complete - to remember, in short, who we are in the digital age.

As the world becomes comprehensively wired, we will need more than ever to understand the difference between data, information and knowledge - that raw data is a mass of symbols, information is something more useful distilled from the data, and knowledge is a still higher level of meaning: information entering the human creative process.

The new technology has done to data-processing what the old technology did to fabrication. It has introduced - at the levels of data and information, not of knowledge - mass production. Electronics has brought the cost of digitally processed data down to the cost of the software doing the processing - at the margin, practically zero. Operations which can be done by a computer, be they the solving of a tax computation or the creation of a digital movie character, become replicable and marketable. Processed information has become a commodity. The same programs available to us are available also to friends, neighbours, colleagues and competitors.

And, like the plastic ball-point pens and other mass-produced gadgets of the industrial age, they lose value as they become available to all. Floppy discs and CD-ROMs collect dust on our desks and in the kids' playrooms. Each of them contains millions of bytes of so-carefully-crafted software - piled up high through the economics of duplication. We have spent centuries wanting more information, so it is astonishing that it has become so cheap. But then, who could ever have imagined in the 18th century, when a simple manufactured article like a knife cost a month's income, that machinery of the complexity of thousands of knives would be left to rust in backyards - dishwashers discarded because a new model had come along, or even just because the old one did not fit the new kitchen?

What will retain value, both personally and professionally, in this age of machines that can conjure up and process information in limitless quantities, is that which computers cannot produce - just as what had value during much of the 20th century was what could not come off a mass-production line. The source of unique advantage, of value, lies elsewhere. It lies in the minds of the millions who are the ideas-creators of the post-industrial age.

James Bailey, author of After Thought: The Computer Challenge to Human Intelligence, points out that we have been taught since the time of the ancient Greeks that rational thought is the pinnacle of human mental achievement: "We as a species made a decision at some point to define human uniqueness around our intelligence." If by "rational thought" is meant logical processing, it is a bad place to plant our flagpole. Human intelligence is much broader. Schopenhauer emphasised almost everything else in our mental armoury: the power of will, drive, emotion. These are deeper planes of human interaction, and much more uniquely ours than data-sorting. If computers have devalued one particular aspect of our intellectual powers - pure computation - by cheapening it, as industrial machinery devalued the ability to pull a great load, so be it. There are plenty of human qualities left untouched, and they will become the more valued. Bailey points out that, if the emergence of powerful computers like Deep Blue forces us to realise that logical processing is not a unique quality of humanity, then that is for the better: "It's going to be a painful process, but if in that process we come to understand that we are not essentially analytical beings, that our essence is something higher, then that's a positive development."

What the future will bring by way of new combinations of human and machine information-processing we cannot know. But so far, the front-running hypothesis is that the creative engine remains the human, with the machine as its servant at a fairly low level of processing - excellent communications, good housekeeping of data, very hygienic in presentation. This hypothesis is also the one that gives us the most respect as people, and the one for which we should prepare ourselves.

To refute the new value systems and philosophies surrounding digital technology, or the notion that the technology will bring about a new phase of human civilisation, is not to be Luddite or otherwise unenthusiastic about its possibilities as a tool. There is a middle ground, one which recognises the power of the tool while studiously avoiding the pitfall of assuming that the rationality of the new machines will be matched - whether for good or evil - by a rational approach to their use. The eminent Spanish-American sociologist Manuel Castells writes in the third and final volume of his The Information Age: "The 21st century will not be a dark age. Neither will it deliver to most people the bounties promised by the most extraordinary technological revolution in history. Rather, it may well be characterised by informed bewilderment."

Extracted from 'Wired Life', Bantam Press, £14.99
