Writing in The Independent on Sunday recently, the distinguished film writer David Thomson went even further: "It has become harder this past year to go back into the dark of the movies with hope or purpose. The place where 'magic' is supposed to occur has seemed a lifeless pit of torn velour, garish anonymity and floors sticky from spilled Pepsi. Forlornness hangs in the air like damp." For him, too, the temple had been desecrated and cinema betrayed into a second childhood, its seniority bringing not increased wisdom but a childish, sulky glee at catastrophe and violence. The arch corrupters, by a strange paradox, turn out to be Hollywood's pre-eminent merchants of innocence - Spielberg and Lucas - who had infantilised the art, and, even worse, seduced it from its fidelity to the real world. "The great age of movie-making was founded in the delicate faith photography kept with reality," lamented Thomson, but now it was possessed by a new infatuation, the promiscuous sensual satisfactions of computer-generated imagery that make anything filmable. And if anything is filmable, then the commercial value of novelty insists that it must be filmed. An inflationary boom in no-brainer awe is set in motion.
I think both these accounts are far too sombre and fearful, though it would be difficult to argue that cinema, and in particular Hollywood cinema, is not in a genuine state of distress just now. And computers certainly seem implicated, either as a subtly poisoning technology or as a direct rival. But in both cases I think there are grounds for optimism, for believing that the gloom is not the approach of twilight but a passing cloud. It is, at first glance, depressing that so many modern entertainment films now aspire to the condition of the computer game; in The Rock, Sean Connery must enter Alcatraz by means of a narrow tunnel that is blasted by alternating jets of flame. Penetration requires a nice judgement of speed and timing. The architecture here has nothing to do with reality, although the script offers a cursory explanation that the device has something to do with refuse disposal (who pays for the prodigious gas bill, you wonder?). But anybody who has ever tried to guide Mario through the labyrinth will immediately recognise this for what it is - a standard scrolling-game hazard. In Sylvester Stallone's latest film, Daylight, there is an almost identical peril to be negotiated - giant fans through which Hollywood's least convincing animatronic must pass before he can reach safety and win the game. On the way, just as in a computer game, he will pick up objects that have to be employed against future obstacles and will accumulate knowledge that seems useless at the time but turns out to have an unforeseen application. This is the movie as a computer game without any buttons. Of course, things aren't quite this simple - computer games deploy a pre-existing set of narrative tricks, often drawing as much from cinema as they give (in the early days, the games replicated the movies rather than the other way round, pixellating stars into tiny sprites who would stiffly run through versions of their screen adventures). 
But the gap between the virtual world they explore and the imaginary universe of Hollywood commercial cinema has narrowed to the point of invisibility. And if this is a combat for adolescent minds, it is one that cinema must lose. As the technology of immersion and real-time rendering improves, computer games will offer thrills of active participation that cinema simply cannot match. But the defeat might still prove a salvation. If computer games pillage from cinema the very thrills that are stifling it - the dumb kinetic pleasures of speed and impact and vertigo - then we may yet have cause to rejoice at the theft. Deprived of these dependable resources, film will be forced to invent new ones or restore the old - the wells of human feeling and complexity that feed all great films. It's worth remembering that photography did not kill painting - as the more short-sighted artists feared - it liberated it and even instructed it.
Here too, though, the pessimists have grounds for anxiety, citing the growing dependence on computers as a means of creating the very images we see. The world will not be recorded, they suggest, it will be forged - with exactly that word's doubled sense of heavy industry and fakery. Rather than butterfly-hunting for beauty, the director will number-crunch his vision into existence ("What we are seeing," writes Thomson, "is less cinematography than an intricate set of rigged effects"). Nor will it stop there: "I have a sneaking suspicion that if there were a way to make movies without actors, George would do it," said Mark Hamill of the director of Star Wars, George Lucas, whose company, Industrial Light and Magic, is the world leader in computer-generated spectacle. This certainly sounds like the doom of humanist cinema but, again, the terror may turn out to be temporary rather than permanent. For the moment, it's true, computer imagery is still an end in itself rather than a means. Twister was not a good story with some brilliant action scenes - it was a show-reel of special effects sequences to which an unconvincing narrative alibi had been added. But this sense of vacancy is often the case with new technologies. Lumière's first strips of film had no other purpose but to demonstrate their own magical facility. The Arrival of a Train at La Ciotat Station, the very first blockbusting movie, was not morally instructive, suspenseful, comic or poignant. It told no story and its title is actually a comprehensive synopsis of its contents. What's more, the thrill it induced in its audiences had nothing to do with the beauty of La Ciotat or the rarity of steam-trains. It took an absolutely banal vision and transformed it into magic by projecting it in a place it should not have been. And the rapid depletion of such marvels was essential to the development of cinema, not inimical. 
Those who suggested that the gimmick would fade had reckoned without Griffith and Murnau and Eisenstein.
The cutting edge of computer graphics induces in us something of the same innocent wonder. Toy Story is interesting in this respect, a film that only young children can watch in a sophisticated way, if we take sophisticated to mean blasé about the technology of representation. When my five-year-old watches Toy Story, he empathises with Woody's feelings of rejection (as indeed he should); when I watch Toy Story, on the other hand, I find myself waiting for marvels of depiction, details that induce something not far short of adoration (it is, I confess, a dumb kind of pleasure). The dangling strap of the removals van that Woody clings to in the closing sequence of the film is more exact in its weave and the lithe physics of its motion than seems possible. It is not that it is as realistic as a film of the same object. It is more realistic, possessed of a resolution and control that are quite alien to cinema. This wonder is similar to that experienced by the Lumière brothers' first audiences, but it is different in one important respect - because one of its components is the sense of a shared perception, a common experience of tiny things being noticed.
It won't be long before real talents become bored by the ingenuous splendours of such devices and begin to explore what other things they might achieve. Before, in other words, the technology ceases to be a toy and starts to be a tool. When that happens, computer imagery will submerge beneath the surface of the film, will become an inconspicuous component of its glamour and enchantment rather than a self-advertising one. It will be invisible, not in the sense that you won't be able to see it, but in the sense that you will look straight through it to the real subject of the film. Cinema has been condemned to death by technology before - by the arrival of sound and even, for some devotees, by the introduction of colour. It survived both those corruptions and it will survive this one too, if only the congregation in the dark can keep their faith in testing times.