Every time you log on to Facebook you’re given opportunities to share information about yourself: from status updates to biographical info, the social network wants to know about you. However, a research paper recently published by the site has revealed that Facebook also records what you don’t share.
The paper’s authors refer to this information as ‘self-censorship’ – the times when users start typing out a status update, read back what they’ve written and think: actually, no, they don’t want to share that.
The collection of this sort of information has been highlighted before: last year it was noted that Facebook logs the friend requests you reject as well as those you accept. For the engineers who run the site, the aim is clear: if they can work out why you chose not to share, maybe they can persuade you to hit that ‘Post’ button in the future.
Data scientist Adam Kramer and PhD student and Facebook intern Sauvik Das produced the study, collecting information from 3.9 million users over 17 days. An act of self-censorship was logged when an update of more than five characters was typed out but not shared within 10 minutes. Kramer and Das stress that they recorded this data only as a yes/no result; they didn’t log either “keystrokes or content”.
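The logging rule described above can be sketched as a simple predicate. This is purely illustrative – the function name, parameters and threshold constants below are assumptions based on the paper’s description, not Facebook’s actual code – but it captures the yes/no nature of what was recorded:

```python
# Illustrative sketch of the study's self-censorship rule:
# a draft counts as self-censored if more than five characters
# were typed but nothing was posted within ten minutes.
# Only this boolean outcome is logged, never keystrokes or content.

TEN_MINUTES_SECONDS = 10 * 60


def is_self_censored(chars_typed: int, posted: bool, seconds_elapsed: float) -> bool:
    """Return True if a draft qualifies as an act of self-censorship."""
    return chars_typed > 5 and not posted and seconds_elapsed >= TEN_MINUTES_SECONDS
```

For example, a 40-character draft abandoned for ten minutes would be logged as a single “yes”, while the same draft posted within that window would not.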
The results of the study showed that self-censorship is extremely common (71 per cent of users practise it) and that the “perceived audience” is what most often stays users’ hands: status updates and posts to groups were more frequently censored than comments, and people with more “politically and age diverse friends” censored themselves less.
This latter point suggests that the psychological phenomenon of groupthink might thrive on Facebook: if you’re speaking to an audience composed of similar individuals, you’re more likely to refrain from posting out of fear of criticism; if you have a more diverse array of friends, you might be more used to that sort of rough and tumble.
However, a diverse audience might just as likely lead to the posting of what Kramer and Das term “lowest common denominator content” – information or opinions that are palatable to a wide array of people – and in general they found that “people censor more when their audience is hard to define”.
The general results of this study are not surprising – in fact, they might be quite heartening for tech-sceptics who think that sites like Facebook have led to entirely thoughtless sharing – but for Facebook, it’s clear that more information needs to be mined.
The study concludes: "Through this work, we have arrived at a better understanding of how and where self-censorship manifests on social media; next, we will need to better understand what and why."