Facebook, Cambridge Analytica, and your data – a look back one year after the personal data scandal
Last Sunday marked the first anniversary of newspapers breaking the story of Cambridge Analytica. The Guardian and the New York Times revealed that the British political consulting firm had been able to harvest the personal data of millions of Facebook users without permission.
Cambridge Analytica had to file for insolvency just a few months after the scandal, but most of the outrage fell on Facebook, which was raked over the coals for allowing this to happen.
The social media company’s founder and chief executive Mark Zuckerberg was hauled in front of politicians in the US and Europe, and Facebook has been working relentlessly to fix its reputation and restore public trust.
The ramifications of the scandal are still being felt. To mark the occasion, I asked several tech experts to weigh in on the incident, and how it has affected not just Facebook, but the entire tech industry.
What actually happened?
Using a survey app called “This Is Your Digital Life”, Cambridge Analytica got consent to access data from hundreds of thousands of Facebook users.
But because of the social media site’s design, the firm was able to harvest information from those users’ entire network of friends – without permission from these other people. Facebook estimates that Cambridge Analytica ended up with data from 87m profiles.
According to the New York Times, this data contained enough detail to create psychological profiles of users, which could be used to suggest the kind of adverts they would be most susceptible to.
It’s alleged that this data was used to target voters in political campaigns, including the Brexit referendum and the 2016 US presidential election.
Note that this wasn’t technically a data breach. Instead, it was a case of “bulk data sharing”, says Ben Lorica, chief data scientist at O’Reilly Media.
“This came down to a lack of disclosure,” he explains. “Not only was user data made available through a programme for Facebook developers, but copies of the data were stored in the hands of programme participants. Enter Cambridge Analytica, which had easy access to large volumes of data that users weren’t even aware that they had given away.”
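To make the mechanism Lorica describes more concrete, here is a minimal sketch of how a survey app authorised by a single user could, in the era of Facebook's old Graph API v1.0 friend permissions, also pull data about that user's friends. This is not Cambridge Analytica's actual code; the endpoint version, field names and permission behaviour are assumptions for illustration only, and this style of friend-data access has long since been shut down.

```python
# Illustrative sketch only: how one user's consent could expose friends' data
# under old-style Graph API friend permissions. Field names and the v1.0
# endpoint paths are assumptions for illustration, not a working recipe.
import requests

GRAPH = "https://graph.facebook.com/v1.0"  # long-deprecated API version


def harvest(user_access_token: str) -> dict:
    """Collect profile data for the consenting user and, crucially, their friends."""
    token = {"access_token": user_access_token}

    # 1. Data the consenting user agreed to share with the survey app.
    me = requests.get(f"{GRAPH}/me",
                      params={**token, "fields": "id,name,likes"}).json()

    # 2. The user's friend list, which apps of that era could request.
    friends = requests.get(f"{GRAPH}/me/friends", params=token).json()

    # 3. With friend-level permissions granted by the SAME user, the app could
    #    then read fields about each friend, who never installed the app or
    #    consented themselves -- the "bulk data sharing" Lorica describes.
    profiles = {}
    for friend in friends.get("data", []):
        profiles[friend["id"]] = requests.get(
            f"{GRAPH}/{friend['id']}",
            params={**token, "fields": "id,name,likes"}).json()

    return {"user": me, "friends": profiles}
```

The key point the sketch illustrates is that every request is made with one person's access token: a few hundred thousand consenting survey-takers were enough to pull in an estimated 87m profiles, because the data of non-consenting friends came along for the ride and copies of it then sat on the app developer's own servers, outside Facebook's control.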
How did Facebook respond?
The company took a big hit from the scandal. Its share price dropped, Zuckerberg was grilled by politicians in a US Senate hearing, and Facebook has faced sanctions, including a £500,000 fine from the UK Information Commissioner’s Office.
While financial penalties may not dent its profits much, the company has become the poster child for data mishandling, facing constant criticism over the past year for failing to protect users' data.
Zuckerberg’s latest face-saving move was an announcement this month that Facebook was going to pivot towards privacy and private conversations, and encourage users away from making public broadcasts.
However, this new direction has also come under criticism.
“What does this mean coming from Zuckerberg who acknowledges that ‘we don’t currently have a strong reputation for building privacy protective services’?” asks Audra Simons, the director of Forcepoint Innovation Labs.
She adds that, while the statement shows Facebook recognises the importance of data privacy and implies that it is changing, she doubts that it will sufficiently alter its business model – based on collecting personal data to help brands target users with ads – to allay the fears of those concerned about their privacy.
What about regulations?
Over the past year, new rules have been rolled out in several countries and US states, including the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act – although the process for their implementation had begun well before the Cambridge Analytica news broke.
The scandal has drawn attention to the need for further regulations to make it much harder for tech firms like Facebook to harness user data.
“The Cambridge Analytica scandal has drawn worldwide attention to business models built on personal data,” says Duncan Brown, chief security strategist (EMEA) for Forcepoint.
“Combined with the reality of GDPR, we’re beginning to see worldwide impact. Businesses may not realise it yet, but international regulatory frameworks are beginning to shift towards the European standards.
“This means that, whether Facebook has learnt its lesson or not, regulation is coming. All social media and data aggregator firms will feel the impact.”
Has public trust declined?
The incident also raised awareness about the issue of personal data and privacy in general. As a result, consumer trust in tech companies like Facebook has declined sharply.
“The Cambridge Analytica scandal catapulted the value of personal data to the top of the news agenda, with many consumers evaluating the importance of their online presence for the first time,” says Derek Roga, chief executive of Equiis Technologies.
“They became aware that, essentially, they are the by-product of any free service they use, and are exchanging their data for the functions they use for free, such as Facebook.”
This awareness has led to renewed calls from both users and regulators for greater control over how companies can use personal data, and for more transparency about how it is collected, according to Lorica.
“It’s not just users fearful of falling victim to an affair like this who are beginning to change their attitude towards data sharing, but companies too,” he claims. “More and more companies are signalling that they take data privacy and the concerns of their users and regulators seriously.
“The stance companies take on data privacy and monetisation is becoming a competitive angle for some small and large firms. Apple is a great example of a company raising the bar on data privacy and collection, with more companies beginning to follow suit.”
But despite this greater awareness about personal data, consumers’ attitudes and behaviour haven’t fundamentally changed, argues Alan EJ Jones, chief executive and co-founder of private messaging app Yeo.
“Users continue to support Facebook with our daily patronage to other people’s La La Land, and Facebook in return continues to capitalise through mining our data,” he says. “One wonders what it will take for behavioural change among the 2.3bn Facebook customers to truly force change.”
Jones adds that, unless people break their habit of using Facebook’s umbrella of apps – such as WhatsApp, Messenger and Instagram – in order to damage the company’s revenue model, we’ll never see it change its behaviour.
Fast-forward
The Cambridge Analytica scandal was a shock to the tech industry. It forced many companies to re-evaluate their data and privacy policies in reaction to increased regulations and demands from privacy-concerned consumers.
Facebook ended up taking most of the flak from the incident, perhaps unfairly – after all, it was Cambridge Analytica which harvested the data. But it was the social media platform’s responsibility to protect its users’ information, and it has since come under immense political and economic pressure to change its ways.
How earnest and effective this pivot to privacy will be remains to be seen. Facebook has built a business empire on personal data, so significantly changing that model overnight will be a challenge.
But if Facebook doesn’t change its ways, and gets hit by another data scandal of the same magnitude as last year’s, that might prove to be the final nail in the coffin of its plans for world domination.