On Boxing Day last year, a distraught traveller posted a cry from the depths of his heart on the Auckland community website. "I lost an Olympus digital camera during a trip to Waiheke Island on 29 November," he wrote. "I'm from overseas and have all my NZ experiences and memories in it."
All my NZ experiences and memories.... We've come a long way. Once, this unfortunate traveller would have known that his cri de coeur turned on what rhetoricians call "metonymy": naming the thing contained for its container. "Pass the milk," we say, when we mean, "Pass the bottle which contains the milk". Only an über-geek would quibble. The rest of us do it all the time.
But "memories"? Even the camera companies which have cleverly taken to calling photographs "memories" know they're pulling a fast one. Photographs aren't memories; surely they're containers-for-memories, or memory-joggers? Further down the line, we can say photographs are (or should be) proofs, reports, records, evidence – they can even be works of art. And in a few, rare, photographers like Cartier-Bresson, Robert Doisneau or Brassaï, the graphic line and imagination collide, merge, and produce something new.
But not memories. Thinking of rhetoricians brings to mind the greatest living practitioner of political rhetoric, the young Jon Favreau, Barack Obama's speechwriter. And thinking of Obama brings to mind his daughter Malia, who, at her father's inauguration – one of the most-photographed occasions in history – could be seen taking photo after photo of famous or cool people shaking hands with Dad.
Further back was a woman who appeared to watch the inauguration on the screen of her camera. She was there. It was real. But perhaps it wasn't really real unless seen on that great tyrant of our culture: the screen. Doesn't matter what screen. Doesn't matter how big or how bright or what resolution. If it's not on-screen, it's not happening.
None of us saw it coming. Twenty-five years ago, when the creatives' favourite computer, the Apple Mac, was born, the screen was a shy little thing. You turned it on, did some work, then turned it off again. Writers printed stuff out, switched off their screens and sent their copy to be typeset. Accountants transcribed their pencil ledgers into primitive spreadsheets then turned off their computers and sat back, rubbing their eyes.
And photographers? Photographers didn't turn it on at all; they still did their work on silver halides, on film, and in red-light darkrooms. None of us foresaw a time when almost every human activity would be mediated through the glowing matrix of an LCD screen. None of us foresaw the time when the world would become flattened and constrained to the 23-inch rectangle of the widescreen monitor. None of us saw the loss of texture: of snapshots in envelopes and flimsy orange negatives, of slides in mounts and finding the projector and gathering the family. All of us still thought a photograph was something that followed the event, usually after a week's wait; and most of us still believed that, without a projector, a photograph was something that could only be looked at by two or three people at a time.
Over the past decade, though, the photograph has become a commodity; a commodity that (once you've bought the camera) is more or less free. That, and the equally unforeseen rise of the net, the speed of broadband, and the fall in the cost of storage, has meant that this has been the most widely documented decade in human history.
It's time for a new law. Arthur C. Clarke famously wrote that "any sufficiently advanced technology is indistinguishable from magic" (to which Larry Niven responded that any sufficiently advanced magic was indistinguishable from technology). What has become clear over the past decade is that any sufficiently cheap technology will become compulsory. Cheap, almost free, digital photography and cheap, almost free, publishing through the likes of Flickr and MySpace and YouTube: these have led to the paradigm of human activity as something which is verified by being first recorded, then published.
If not... did it happen? Was I really there?
We've even managed to take ourselves out of the necessary loop. Back in the day, a photograph required focusing, exposure, winding on, taking out of the camera, a trip to the chemist's, a wait, another trip to the chemist's and, at last, the chance to see which ones had come out. No longer. On my iPhone I can press a (virtual) button and the camera will not only take the picture, but will publish it, instantly, on the social networking site of my choice. I do not even have to look at it.
The changes brought about by technology have altered more in our culture than simply making it cheaper and easier to make photographs. One of the many retrograde steps, under the guise of progress, is that the "decisive moment" named by Cartier-Bresson and instinctively understood by all photographers (as opposed to just people-with-cameras) has become next to impossible to capture with an inconspicuous camera. Press the button on an old film camera and the shutter fired almost instantly. Now, press the button and all sorts of things happen. There are inexplicable pauses, whining noises, a suspicious, synthesised click which makes you think it's done, so you move the camera, and then, finally, it takes the snapshot. A good thing they have a screen: at least you can see what it has actually photographed, as opposed to what you wanted it to photograph.
More importantly, perhaps, the nature of the photograph has changed. Its transience makes it seem less real: press delete and it's gone. The passage of photons through the lens no longer effects a permanent change. The image is ultimately disposable. Digital technology's potential for almost infinite duplication, too, has changed the game. Once there was a thrill in going to a photography exhibition and seeing pictures "for real" – not printed, but made from light passing through the original negative and on to paper. Nor does it feel "real" that the photograph is, like everything else, just another damn thing on the screen. It has no texture. It doesn't curl in the hand. The head and shoulders of a love object snipped carefully from a 5x4 print is more real than the same thing Photoshopped neatly from a jpeg file leaving no trace of its theft at either end.
Perhaps this is why documentary reportage has almost vanished: the images are no longer so real, and the making of an image – the idea that something is "worthy" of an image, which we all instinctively did when we only had 12 or 24 or 36 frames in our cameras – is no longer special. Nor is the idea of the reporter, the photojournalist, much respected; we are all photojournalists now: citizen journalists, with opportunities for reproduction and distribution of which the great smudgers of the past could only dream.
The relationship between image and reality has changed. We no longer read photographs as texts, but as a commentary on themselves. "Here is proof," says the photo, "that I was here", but "I" isn't the person who took the snap; it's the photograph itself. If you can't remember where you were or when, your friends, or those you publish the picture to, can transfer it to Google Maps, choose satellite view and zoom in to the building where, at 2.17am on 7 January, you were snapping Jezz on your Nokia and uploading it to Facebook. The whole enterprise was conducted to produce a public image. Sometimes I get a spooky feeling we're being elbowed aside, becoming Morlocks to the cameras' Eloi. What's going on in my computer? Armed with all that data – when, where, how high, how bright – and the endless cross-referability of the web, are my photos becoming custodians of themselves? Is the computer looking at them on my behalf? What is iPhoto doing when it's not active? Was I there... or was it just my photographs?
Clearing through my father's papers after he died, I found his photo folder. A real one, made of battered shagreen. In it was a picture of his long-dead brother; one of his father as a young man; one of his wife as a 13-year-old girl with her mother and sister. Pictures of the dead. Pictures of people who could not be seen in reality, ever again, kept private in his desk drawer. Quite at odds with our way of looking now. But so was the idea of photos on a telephone. Why, he asked, would you want it? "Because they're both media," I said, clever me, "and so converge." "Well," he said, "stew and treacle pudding are both food, but you wouldn't want them on the same plate."
And reality and photographs can both be seen... but I wonder what the young woman at the inauguration will see when she looks at her pictures; or whether she will look at them at all.