For Facebook, trust is an essential commodity. It's behind every one of the interactions its 200 million global users have with the website every day, whether that means sharing images you're happy for your friends to see but not your boss, sending personal messages or updating your status to show people you know what you're up to. This trust has been central to Facebook's meteoric rise ever since it launched in the insular environs of Harvard University in 2004, enshrined in the privacy controls that stand between everything you've uploaded to the site and the online public.
Running to five pages, the site's current iteration of privacy controls is a bewildering warren of clauses and sub-clauses, asking if you'd like to share with this network and that network, or whether you want to wade through lists of friends to decide whom you deem worthy of viewing your profile. It's hard to imagine situations where these apply - where you're happy for someone to view your profile but not pictures with you in them, for example - but the truth is that, like a city growing faster than its planners can handle, options have been built on top of options, with sedimented layers of settings complicating the system past the point of usefulness.
This new wave of changes will be met with consternation from Facebook users (they always are), but after the series of scandals that has hit the social network recently, it's crucial the company gets them right. Hacked accounts, fraudulent advertisers, malicious applications and a bitter argument over the site's terms of service have all knocked the trust many users have in Facebook and the direction its owners are taking it in. Facebook may have vanquished MySpace, but with Twitter hot on its heels and the site still losing money, for its 24-year-old billionaire creator Mark Zuckerberg it's crunch time. As he knows, without trust, Facebook can't survive.