Working as I do in the research, data and insights industry, data privacy – and public awareness of data collection – is something we talk about a lot. For many of us in market research, the concepts of consent and transparency are hard-wired into us. Traditionally, when speaking to the public face to face or on the telephone, consent was implicit. As insight data-gathering moved (and grew) online, the industry was careful to build in codes and guidelines to ensure consumers are protected and quality standards are maintained.
But in our digitally connected world our every move is being collected somewhere. As Clive Humby, architect of the Tesco Clubcard, said: "data is the new oil". That's why Twitter is worth billions despite never turning a profit – 500 million tweets a day is a firehose of data on people's opinions, likes and dislikes, on an unprecedented scale.
So this glut of information about us is now in the hands of companies in the tech sector, and consent is limited to a tick box and a link to a multi-page small-print document that no one ever reads. More often than not, the questionable data practices that make the news come from companies outside the research and insight industry.
For me the most worrying element of the Cambridge Analytica data scandal is whistle-blower Christopher Wylie's claim: "I assumed it was entirely legal and above board". To anyone working in the research and insights industry, this kind of data collection is clearly illegal. The industry has fought hard to keep self-regulation, and there are of course checks and balances. Data protection is regulated – in the UK by the ICO (Information Commissioner's Office) – and most research and data collection companies are members of long-standing trade associations that compel them to sign up to ethical codes designed to safeguard public privacy and ensure consent, which is fundamental to maintaining public confidence in the research profession.
If the allegations are proven true, and the ICO finds evidence of wrongdoing by Cambridge Analytica and Facebook, you can expect far-reaching repercussions. People understand now, more than ever, the value and power of their personal data. The Cambridge Analytica case will demonstrate the extent to which (supposedly) private data can be used to manipulate them.
But for the public and lawmakers alike, data collection is data collection, and a data breach is a data breach. So even if those in the research industry can see a clear and unequivocal difference between the Cambridge Analytica profile scraping – or the exposure of the personal data of 57 million Uber users back in 2016 – and the insight profession's diligently regulated systems, lawmakers and the public do not. The question is: should they? This miscomprehension is indicative of a staggering disconnect between the ethical and legal considerations of some of those involved in new tech, and everyone else. A recent report from the Market Research Society on consumer trust in the UK found that the security of personal data was (and is) the biggest driver of consumer trust, and that 51 per cent of consumers do not want companies to use any of their data at all.
The research and insight industry has a long and successful track record of self-regulation, and will continue to fight for that status in order to clearly distinguish itself from less ethical practices. The research and insights profession was worth over $6.6bn in the UK alone in 2017. Market and opinion research is used to guide the creation of new policies and laws; charity and public opinion research helps, for example, refugees to have a voice; and social research helps us to understand and support the lives of the LGBTQ community in countries with under-developed social legislation or restrictive social mores.
An unregulated data collection industry potentially stifles all that good (and essential) work, and makes it harder for law-abiding and ethical researchers to understand and reach people, and to help governments, societies and businesses make better decisions. If companies continue to (ab)use personal data as a commodity, forgetting that they are really dealing with the lives of people, then the call for regulation will only grow louder.
It is vital that companies in the emerging tech space engage with the checks and balances that already exist in data protection and in the research and insight space – and not merely engage, but contribute, by demonstrably providing privacy standards, respect and consent to their customers and users. Not only will that ensure that the consumer information they hold can legitimately be used, it will also radically improve the trust these companies are currently haemorrhaging, and safeguard their ability to innovate with other data types in the future.
Finn Raben is the Director General of ESOMAR, a member-based industry body that represents a global community of 40,000 research and insights professionals.