Stacy Snyder wanted to be a teacher. By spring 2006, the 25-year-old single mother of two had completed her coursework at Millersville University in Pennsylvania and was looking forward to her future career. Then her dream died. Summoned by university officials, she was told she would not be a teacher, although she had earned all the credits, passed all the exams and completed her practical training (much of it with honours). She was denied her certificate, she was told, because her behaviour was unbecoming of a teacher.
Her behaviour? An online photo showed her in costume wearing a pirate's hat and drinking from a plastic cup. She had put this photo on her MySpace web page, and captioned it "drunken pirate", for her friends to chuckle over. The university administration, alerted by an over-zealous teacher at the school where Stacy was interning, argued that the online photo was unprofessional since it might expose pupils to a photograph of a teacher drinking alcohol. Stacy considered taking the photo offline. But the damage was done. Her page had been catalogued by search engines, and her photo archived by web crawlers. The internet remembered what Stacy wanted to have forgotten.
Stacy later unsuccessfully sued her university. She claimed that putting the photo online was not unprofessional behaviour for a budding teacher. After all, the photo did not show the content of the plastic cup and even if it did, Stacy was old enough to drink alcohol at a private party. This case, however, is not about the validity of the university's decision to deny Stacy her certificate. It is about something much more important. It is about the importance of forgetting.
Since the beginning of time, for us humans, forgetting has been the norm and remembering the exception. Today, because of digital technology and global networks, this balance has shifted. Forgetting has become the exception and remembering the default. The potential consequences are enormous.
Stacy Snyder's case is not exceptional. Dozens of cases of profound embarrassment, and even legal action, have occurred since then – from the lawyer who cannot get the internet to forget an article published in a student newspaper more than a decade earlier, to a young British woman who lost her job because she mentioned on Facebook that it was "boring". Worldwide, perhaps 600 million people have pages on social networking sites. Disclosing one's information – through Facebook or MySpace entries, blogs, photos, networks of "links" or "friends", content preferences, "geo-tagging" or "tweets" – has become deeply embedded in youth culture. As these young people grow older, and more adults adopt similar habits, Stacy Snyder's case will become paradigmatic, not just for a generation but for society as a whole.
Web 2.0 has fuelled this development, but conventional publishing – paired with the power of the internet – has produced similar results. Take the case of Andrew Feldmar, a Canadian psychotherapist in his late 60s living in Vancouver. In 2006, on his way to pick up a friend from Seattle-Tacoma International Airport, he tried to cross the US/Canadian border, as he had done over 100 times before. This time, however, a border guard queried an internet search engine for "Feldmar". Out popped an article Feldmar had written for an interdisciplinary journal in 2001, in which he mentioned that he had taken LSD in the 1960s. Feldmar was held for four hours, fingerprinted and, after signing a statement that he had taken drugs almost four decades ago, was barred from further entry into the United States.
An accomplished professional with no criminal record, Feldmar knows he violated the law when he took LSD. But he maintains he has not taken drugs since 1974, more than 30 years before the border guard stopped him. It was a time in his life that was long past, an offence that he thought had long been forgotten by society as irrelevant to the person he had become. But because of digital technology, society's ability to forget has become suspended, replaced by perfect memory. "I should warn people that the electronic footprint you leave on the net will be used against you," Feldmar said. "It cannot be erased."
Snyder and Feldmar had voluntarily disclosed information about themselves. Often, we disclose without knowing. Outside the German city of Eisenach lies MAD, a mega-disco with space for 4,000 guests. When customers enter, they have to show their passport or ID card; particulars are entered into a database, together with a digital mug-shot. Guests are issued a special charge card, which they must use to pay for drinks and food. Every such transaction is added to a guest's permanent digital record. By the end of 2007, MAD's database contained information on more than 13,000 individuals and millions of transactions. Sixty digital video cameras continuously capture every part of the disco and its surroundings; the footage is recorded and stored in over 8,000GB of hard disk space. Real-time information about guests, their transactional behaviour and their consumption preferences is shown on large screens in a special control room. Management boasts how, through the internet, local police have 24/7 online access to customer information stored on MAD's hard disks. Few if any of the disco's guests realise that their every move is being recorded, preserved for years, and made available to third parties – creating a comprehensive information shadow.
For an even more pervasive example, consider internet search engines. Crawling web page by web page, Google, Yahoo!, Bing, Ask.com and a number of others index the web, allowing all of us to access it simply by typing a word or two into a search field. We assume that such search engines "know" a great deal of the information that is available on the global internet. However, they remember much more than what is posted on web pages.
In the spring of 2007, Google conceded that until then it had stored every single search query ever entered by each of its users, and every single search result a user subsequently clicked on to access it. By keeping the massive amount of search terms – about 30 billion search queries reach Google every month – neatly organised, Google is able to link them to demographics. For example, Google can show search query trends, even years later. It can tell us how often "Iraq" was searched for in Indianapolis in the fall of 2006, or which terms the Atlanta middle class sought most in the 2007 Christmas season. More importantly, though, by cleverly combining log-in data, cookies, and IP addresses, Google is able to connect search queries to a particular individual across time – with impressive precision.
The result is striking. Google knows for each one of us what we searched for and when, and which search results we found promising enough to click on them. Google knows about the big changes in our lives – that you shopped for a house in 2000 after your wedding, had a health scare in 2003, and a new baby the year later. But Google also knows minute details about us: details we have long forgotten, discarded from our mind as irrelevant, but which nevertheless shed light on our past: perhaps that we once searched for an employment attorney when we considered legal action against a former employer, researched a mental health issue, looked for a steamy novel, or booked ourselves into a secluded motel room to meet a date while still in another relationship. Each of these information bits we have put out of our mind, but chances are Google hasn't. Google knows more about us than we can remember ourselves.
Google has since announced that it will no longer keep individualised records forever, but will anonymise them after a period of nine months, erasing some of its comprehensive memory. But keeping individualised search records for many months still provides Google with a very valuable information trove it can use as it sees fit. And once the end of the retention period has been reached, Google's pledge is only to erase the individual identifier of the search query, not the actual query, nor the contextual information it stores. So while Google will not be able to tell me the terms I searched for and what search results I clicked on five years ago, it may still be able to tell me what a relatively small demographic group – middle-aged men in my income group, owning a house in my neighbourhood – searched for on the evening of 10 April five years ago.
Google is not the only search engine that remembers. Yahoo!, the second-largest internet search provider in the world with about 10 billion search queries every month, is said to keep similar individual records of search queries, as does Microsoft. But other organisations, too, collect and retain vast amounts of information about us. Large international travel reservation systems are similarly remembering what we have long forgotten. Credit bureaux store extensive information about hundreds of millions of individuals. For example, the largest US provider of marketing information offers up to 1,000 data points for each of the 215 million individuals in its database. In addition, doctors keep medical records, and are under economic and regulatory pressure to digitise decades of highly personal information and commit it to digital memory. Law enforcement agencies store biometric information about tens of millions of individuals, even if they have never been charged with a crime – and most of these sensitive yet searchable records are never deleted. And in the UK alone, 4.2 million video cameras survey public places and record our movements.
This may be only the beginning. Already a number of mobile phones sport GPS receivers, making it possible to track our movements with precision. Numerous companies are marketing GPS tracking devices so that worried parents can follow the activities of their teenagers, or suspicious spouses the movements of their (unsuspecting) partners. The first digital cameras with GPS chips have appeared, adding location information to each photo or video we shoot, so that not only date and time but also the place of our mementos is etched into digital memory – memory that vastly exceeds the capacity of our collective human mind.
This is not necessarily all bad. In a number of ways, an affordable and comprehensive memory is advantageous for us, both individually and for society. However, this may also lead to terrible consequences. Privacy experts have been warning of some of these consequences for years. But the perfection of digital memory – which has been encouraged by policy-makers – has implications that go beyond the overbearing surveillance of citizens by the state. It also affects the very nature of human existence.
Forgetting plays a central role in human decision-making. It lets us act in time, cognisant of, but not shackled by, past events. It is not just an individual behaviour. We also forget as a society. Often such societal forgetting gives individuals who have failed a second chance. We let people try out new relationships, if their previous ones did not make them happy. In business, bankruptcies are forgotten as years pass. In some instances, even criminals have their convictions expunged from their record after sufficient time has passed. Through these and similar mechanisms, our society accepts that human beings evolve over time, that we have the capacity to learn from past experiences and adjust our behaviour.
The shift from forgetting to remembering is monumental, and, if left unaddressed, it may cause grave consequences. Such a future, however, is not inevitable.
It is not technology that forces us to remember. Technology facilitates the demise of forgetting – but only if we humans want it to. We can, equally, decide that we wish to reverse that change.
Retaining information in our digital memories has become the default of how we operate. Committing information to digital memory no longer requires a conscious act, or even a tiny bit of time, energy or money. Digital forgetting, on the other hand, now demands that extra quantum of human effort.
I suggest we reset this balance and make forgetting just a tiny bit easier: just enough to flip the default back to where it has been for millennia. We can aim to achieve this in many different ways, from the strengthening of privacy rights in law to digital rights management (DRM) and voluntary digital abstinence. While each one of these approaches is useful, no one approach alone will be sufficient to meet the challenge we face – and, given their shortcomings, we may have to add a further, novel one.
This new approach – a combination of raised human and societal awareness, technical tools, and supporting legislation – could make a significant difference. It is to set, as a matter of course, expiration dates for digital information. The idea is that we can mimic human forgetting by associating information we store in digital memory with expiration dates that users set. Our digital storage devices could be made to automatically delete information that has reached or exceeded its expiration date. In its most basic form, I believe this might be sufficient to reintroduce the contours of forgetting, although numerous more sophisticated variations are conceivable. Users, when saving a document they have created, would have to select an expiration date in addition to the document's name and location on their hard disk. They wouldn't be able to save the file without specifying an expiration date, just as they can't save a file without selecting a name for it.
Based on these preferences, the users' computers would do the rest: managing expiration dates and clearing out expired files, perhaps once a day. Of course, users would have a little software utility to change the dates in case they discovered information had lost its value earlier than expected, or remained important and useful beyond its originally envisioned lifespan. The paranoid might even have a tool that warned them whenever information was close to reaching its expiration date, so they could decide to adjust the date if they so desired. But by having our computers delete files that have reached the dates we set (much as we clean out foodstuff that has expired), we reintroduce forgetting into our daily routines and shift the default back from pervasive remembering to human-controlled forgetting.
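The housekeeping routine described above is simple enough to sketch in a few lines of Python. Everything here is hypothetical – the manifest file recording each file's user-chosen expiration date, and the `purge_expired` function itself are invented for illustration – but it shows how little machinery the core idea requires:

```python
import json
import os
from datetime import date

def purge_expired(manifest_path, today=None):
    """Delete every file whose user-set expiration date has passed.

    The manifest is a hypothetical JSON file mapping each file path
    to its expiration date in ISO format (e.g. "2030-06-01").
    """
    today = today or date.today()
    with open(manifest_path) as f:
        manifest = json.load(f)

    kept = {}
    for path, iso_date in manifest.items():
        if date.fromisoformat(iso_date) < today:
            # The date we set has passed: the file is "forgotten".
            if os.path.exists(path):
                os.remove(path)
        else:
            kept[path] = iso_date  # still within its lifespan

    # Write back only the entries that survived the sweep.
    with open(manifest_path, "w") as f:
        json.dump(kept, f)
    return sorted(kept)
```

Run once a day – much as our computers already schedule other maintenance tasks – such a sweep would quietly clear out what we decided, at save time, should not last.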
The need to enter an expiration date should prompt users to reflect, for a moment, on the lifespan of the information they intend to store, but it should not annoy them with a cumbersome and complex user interface. Entering a date could be made easier by permitting users to choose a relative date (say, a month from today), as well as a carefully thought-through selection of presets and defaults, perhaps based on the file type, the selected file location, or even some (limited) semantic analysis of the file's content.
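A preset mechanism of the kind just described might look like the following sketch. The lifespans per file type are invented for illustration, as is the `suggest_expiry` function; the point is only that a sensible default can be proposed in one line of logic, leaving the user a single click to accept or adjust it:

```python
from datetime import date, timedelta

# Illustrative defaults per file type; these values are assumptions
# for the sketch, not drawn from any real standard or product.
DEFAULT_LIFESPANS = {
    ".tmp": timedelta(days=7),        # scratch files: a week
    ".jpg": timedelta(days=365 * 5),  # photos: five years
    ".doc": timedelta(days=365),      # documents: a year
}

def suggest_expiry(filename, today=None, fallback=timedelta(days=90)):
    """Propose an expiration date from the file's extension."""
    today = today or date.today()
    ext = "." + filename.rsplit(".", 1)[-1] if "." in filename else ""
    return today + DEFAULT_LIFESPANS.get(ext, fallback)
```

A richer version could weigh the file's location or a limited semantic analysis of its content, but even extension-based presets would keep the dialogue down to a quick thought and a click.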
Technically, expiration dates would be relatively easy to implement. Our digital devices already manage and store an ever-increasing amount of meta-information: file names, creation dates and modification dates are three examples. Digital cameras, too, store meta-information with each picture taken, from shutter speed and aperture to film speed. Our music library software manages meta-information for every track, from title, artist and genre to album art and usage rights.
Expiration dates would simply add another type of meta-information to digital memory: "information about information's life expectancy" if you wish. Much like other meta-information, the expiration date would be made to "stick" to the information it refers to, ensuring that if one copies a file, the expiration information is copied with it. In the background, a small software application would regularly clean out expired information. On many of our digital devices, the framework of such functionality is already present. For instance, our personal computers automatically clean our hard disks of auxiliary files no longer needed on a daily or weekly basis. Discarding expired files could be just one additional housekeeping task.
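The "stickiness" of this new piece of meta-information can be illustrated with a minimal sketch, again using the hypothetical manifest from above: a copy operation that always carries the expiration date along with the data, so no copy outlives the lifespan its originator chose. The function name `copy_with_expiry` is an assumption of the sketch:

```python
import shutil

def copy_with_expiry(src, dst, manifest):
    """Copy a file and keep its expiration date "stuck" to it.

    `manifest` is a hypothetical mapping from file path to
    expiration date; copying the bytes also copies the date.
    """
    shutil.copyfile(src, dst)
    manifest[dst] = manifest[src]  # the expiry travels with the copy
    return manifest[dst]
```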
Ensuring the presence of the necessary technical architecture on the devices we use would require the help of the law. It would not be the first law to prescribe the functioning of software. Existing statutes already constrain software by requiring mechanisms or processes for the protection of information privacy or intellectual property, by compelling software manufacturers to conform to certain principles of security and reliability, or even by mandating that software conforms to certain ergonomic principles. This would add just one more requirement, and one that would be relatively simple to implement.
But the effect would be huge. Expiration dates would limit the amount of information companies – and even the government – could have available on consumers and citizens. Expiration dates would also remind us whenever we commit something to digital storage that most information is not timeless, but linked to a point in time, and thus gradually loses its value.
So far, I have described how an expiration date could work for simple information files. There are circumstances where a more fine-grained approach is preferable. Expiration dates could, for example, be used for search queries. Upon entering a search query, users could be prompted to input an expiration date, or could select one from suitable presets. Users searching at somebody else's computer, or performing a one-off search for a colleague or friend, for example, would select a short period, while those researching for their long-term interests would choose the opposite. Users would have to pay a tiny cost – the time it took to reflect, decide and click – but in return they would get vastly more control over their search history (and more relevant results in future searches), as well as reducing overall digital memory. Quantitatively, search sites would lose some of the search query information they use to fine-tune the ranking of search results, but the quality would probably improve as less-relevant past queries were omitted.
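A per-query lifespan of this kind can be sketched as a small data structure. The class and its method names are invented for the illustration; the essential point is that each query carries its own expiry, chosen at search time, and that only unexpired queries remain visible for later use, such as ranking:

```python
from datetime import datetime, timedelta

class SearchLog:
    """A query log in which every entry carries its own lifespan,
    chosen by the user at search time (all names are illustrative)."""

    def __init__(self):
        self._entries = []  # list of (query, expires_at) pairs

    def record(self, query, lifespan, now=None):
        """Store a query together with its user-chosen lifespan."""
        now = now or datetime.now()
        self._entries.append((query, now + lifespan))

    def active_queries(self, now=None):
        """Drop expired entries; return only the queries still alive."""
        now = now or datetime.now()
        self._entries = [(q, t) for q, t in self._entries if t > now]
        return [q for q, _ in self._entries]
```

A one-off search at a friend's computer might be recorded with a lifespan of an hour; a query tied to long-term research with one of a year. The search engine would then fine-tune its rankings from the surviving, still-relevant queries alone.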
Clearly, easy-to-use interfaces would be needed, if expiration dates are to catch on. A quick thought and an extra mouse click or two should be all that are necessary. Designing such interfaces is a challenge, but not an overwhelming one.
Different societies will differ in their preferences over expiration dates, much as they differ over which information privacy and intellectual property laws they enact and enforce. Societies wanting only a minimum of protection against digital remembering may simply mandate that expiration dates remain connected to the information they describe. In such a system, the recipients of information would be free to alter the expiration date unless limited by contracts or other rules.
If a society wants to go beyond that minimum, lawmakers may mandate that expiration dates generally persist – that is, they can only be changed by the originator or with his or her explicit consent. Even under such a relatively strict system, credit bureaux and direct marketing database vendors will continue to operate, as will computerised travel reservation systems, but it may force them to think a bit harder about their information collection and retention policies. And government agencies may have to consider whether and when to delete citizen data, rather than automatically keeping everything stored.
In many instances, of course, information refers to more than one person. In an e-commerce transaction, for example, the information generated is linked to both buyer and seller. In such cases, a mechanism would be needed whereby an expiration date could be negotiated. In principle, each transactional partner could independently determine the expiration date for the information before committing it to digital memory; one party need not be bound by the expiration date chosen by the other(s). For a more stringent approach, a formal legal rule could mandate that expiration dates be set jointly, so that all parties involved use the same date for the same piece of information. In the context of Amazon's book purchase records, for example, Amazon could offer customers a date range from which to pick a suitable date. Such negotiation would simply become part of the bargaining over the transaction as a whole, and if the partners cannot agree on an expiration date, there will be no transaction. That may hurt vendors, too, not just consumers, especially in difficult economic times.
Expiration dates are not a one-size-fits-all solution. A whole range of options is conceivable, from a bare bones implementation of the core idea to more complex and far-reaching versions. Tensions will remain between an individual's desire to forget and a society's desire to remember (and vice versa).
In contrast to other methods for limiting the tyranny of comprehensive digital remembering, expiration dates offer a number of advantages. Unlike digital abstinence, they embrace participation in digital culture and global networks. In contrast to information privacy rights, they do not depend on individuals fighting costly and time-consuming battles in court. They may be more palatable politically than a comprehensive regulatory approach, and they are less controversial than intrusive digital rights management systems.
Most importantly, they introduce to the digital realm the default of forgetting that is so familiar in our analogue world, without relying on new rights or institutions. They may thus offer a valuable first step towards a more forgetting world, and towards gradually lifting the burden of comprehensive digital memory from our shoulders.
As humans, we do not travel ignorantly through time. With our capacity to remember, we are able to compare, to learn, and to experience time as change. Equally important is our ability to forget, to unburden ourselves from the shackles of our past, and to live in the present.
The choice is ours. Do we want a future that is forever unforgiving because it is unforgetting? If we had to worry that any information about us would be remembered for longer than we live, would we still express our views on matters of trivial gossip, share personal experiences, make various political comments – or would we self-censor? Perfect memory alters our behaviour.
And if all our past activities, transgressions or not, are always present, how can we disentangle ourselves from them in our thinking and decision-making? Might perfect remembering make us as unforgiving to ourselves as to others? In Jorge Luis Borges's short story "Funes the Memorious", a young man, Funes, has lost his ability to forget as the result of a riding accident. Through ferocious reading he has amassed a huge memory of classic works of literature, but he fails to see beyond the words. Once we have perfect memory, Borges suggests, we are no longer able to generalise and abstract, and so remain lost in the details of our past.
What Borges only hypothesised, we now know. Researchers have recently published the case of "AJ", a 41-year-old woman in California who does not have the biological gift of forgetting. She remembers practically every day since she was 11 – not merely that it passed, but in astonishing, agonising detail. She remembers exactly what she had for breakfast three decades ago; she recalls who called her and when, and what happened in each episode of the television shows she watched – in the 1980s. She does not have to think hard. Remembering is easy for AJ – her memory is "uncontrollable, and automatic", like a movie "that never stops". But she feels that, instead of endowing her with a superb facility, her memory restricts her ability to decide.
Modern society faces a similar paradox. Too perfect a recall, even when it is benignly intended to aid our decision-making, may prompt us to become caught up in our memories, unable to leave our past behind, and much like Borges' Funes, incapable of abstract thoughts. This is the surprising curse of remembering.
The "brave new world" of comprehensive digital memory has made possible a comprehensive reconstruction of all our words and deeds, considered and ill-considered, even if they are long past. It has thus created not just a spatial but a temporal version of Jeremy Bentham's panopticon – a form of prison architecture imagined by the 18th-century philosopher in which guards could watch prisoners without the prisoners knowing whether they were being watched (which, he argued, would force prisoners to behave). This in turn constrains our willingness to say what we mean, and even to engage in our society. Do we really want to live in a society of servility and fearfulness?
At the same time, we risk losing the important function that forgetting performs in our decision-making. It permits us to generalise and abstract from individual experiences. It enables us to accept that humans, like all life, change over time. It thus anchors us to the present, rather than keeping us tethered permanently to an ever-more irrelevant past.
Forgetting also empowers societies to be forgiving to their members, and to remain open to change. Digital remembering thus threatens us individually and as a society in our capacity to learn, to reason, and to act in time. It also exposes us to a potentially devastating human overreaction – a complete disregard of our past.
Whether or not expiration dates in particular, and forgetting in the digital age in general, can reverse these trends depends largely on us. Are we willing to take the necessary steps to implement them? If we are, it will require sustained debate to define the exact mechanisms and to build public support. It may even require a movement of sorts, much like the one to reform copyright laws.
But perhaps the first steps for such a movement are already under way. For example, in Argentina, writer Alejandro Tortolini and his colleague Enrique Quagliano have initiated a campaign to "reinvent forgetting on the internet". They have appeared on television, radio, and in print media, and continue to advance their ideas helped by surprisingly robust public support.
While this is heartening, much more remains to be done to increase public support and to establish expiration dates as a viable potential complement to other responses in our quest to revive forgetting. I hope that it is done. If it is, it will help to re-humanise our digital age.
Adapted with permission from "Delete: The Virtue of Forgetting in the Digital Age", by Viktor Mayer-Schönberger (Princeton University Press, £18.95). ©Princeton University Press 2009