The Great Hack: What has Cambridge Analytica shown us about data ethics?
New Netflix documentary The Great Hack explores how data company Cambridge Analytica harvested millions of Facebook users’ data without their consent, then micro-targeted them with persuasive messaging to swing the US election and the EU referendum in the UK.
It raises many questions about the use of data and digital technology to sway voters. Profusion’s head of data science Henrik Nordmark joined Louise Scott for a discussion.
What’s your initial response to the film? What do you think of it?
It’s disturbing – mostly because the questions it raises feel unresolved. And it’s unclear what we as a society ought to do.
Also, it sounds like the people running Cambridge Analytica thought there was nothing wrong with what they did – that they were the victims. The ICO found them guilty of not fulfilling US professor David Carroll’s request for his own data. But that doesn’t take into account the bigger picture: was what they were doing illegal in some other way? This wasn’t clearly stated.
The point was made that British electoral law is not up to date with the 21st century. This leaves us with the feeling that although we might find Cambridge Analytica’s tactics ethically or morally repugnant, they may not have actually broken any laws. I have read elsewhere that the data Cambridge Analytica acquired from Facebook was provided under the assumption that it was being used for academic purposes. So the breach of contract may have been that they misrepresented what the data was being used for.
Also if Cambridge Analytica had not gone into liquidation and had actually provided David Carroll’s data, what else could have been found?
It’s interesting to think about what would happen if the public at large knew the extent of what data is held on them …
Certainly public awareness helps, and documentaries like this raise awareness and show how creepy this can become. It also shows the limitations of current laws. In the podcast How to Meddle in an Election, [digital strategist] David Goldstein tries to replicate what Cambridge Analytica did, but within the confines of the law. Even with a tiny budget of $85,000, he is able to swing the Senate election in Alabama. That’s scary.
As horrified as we might be, there are no laws in place to stop any of this.
Shoshana Zuboff, a professor at Harvard Business School and author of The Age of Surveillance Capitalism, has said: “The whole point is that we just don’t know what really happened. There has been no forensic analysis. It’s not because the data doesn’t exist – it does. It’s that Facebook is sitting on it. That’s why we need the law. What they want is for this to be obfuscated and so difficult to get to the bottom of that we just fade into this kind of haze of ‘Well, it seemed really bad but we just don’t know, maybe it was just overblown’ – and doubt then fills the void.”
What are your thoughts on this?
We should be trying to press harder to find out what happened and, if necessary, change the law and write new laws giving the authorities the power to investigate this at a deeper level. No one can be taken to court for breaking laws that didn’t exist, but we can at least have laws that allow us to figure out what did happen. New laws could help us put some safeguards around this extremely powerful electoral influencing.
At some level it feels like an arms race for political manipulation. Looking at the last US presidential election, you could say the Republicans got the upper hand [by working with Cambridge Analytica]. But by the time the next election comes around in 2020, there will be a more level playing field because the Democrats will be able to do exactly the same thing. But is that really what we want – one team of data scientists fighting another team of data scientists? So whoever has the best team of data scientists wins the election?
It may not be a million miles away from traditional electioneering but it feels so much more extreme when it’s powered in this way and people are unaware of how hyper-targeted they are. At least if you’re aware this is going on, you might be able to be psychologically inoculated against it. Otherwise you will click on those links to websites tailor-made to influence you to behave in a certain way.
Yes. David Goldstein said: “If I only need to change 3% of people to affect a given result, I can go 97 people down and not have an effect. But as long as I have an effect on one, two, three, and only on this one specific action … then I can literally change the world”.
The Great Hack showed how, to get Trump elected, it was necessary to influence just 70,000 people in two states …
If you can become really good at identifying who is persuadable and hyper-target them, that’s like a level of surgical precision in how you do your influencing. It’s also about simple repetition of succinct messaging. With the Vote Leave campaign, Dominic Cummings came up with the slogan: ‘Take back control’, which feels psychologically empowering and is more likely to get results – especially among people who have been feeling disenfranchised.
How different is it from what we had before with traditional campaigns?
Previously, everyone saw the same TV ads from both sides of the political spectrum. But with micro-targeting, each individual gets encapsulated into a little bubble that has been specifically designed for them, where everything points them in a certain direction. And they might not even realise it. Meanwhile, another person is in their own little bubble which is totally different. The two might not even be able to agree on the facts because their sources of information are feeding them very different views of reality.
To what extent do you think Cambridge Analytica hacked people’s minds (unethically)?
At some level I think they did. Even if they didn’t manage to hack every single person’s brain or psychology, if they are able to influence enough people then that counts. On the other hand, one could argue that people had free will, nobody forced them to go to the ballot box or not, or vote for a party. Nevertheless, some of the techniques being used by Cambridge Analytica were deemed by the UK’s Ministry of Defence to be a form of psychological warfare – ‘psy-ops’ as they call it – and that shouldn’t be used on a civilian population. Some of these techniques have been used in places like Afghanistan to change the minds of people. If you are using tactics from warfare in civil society, that in itself seems scary.
In the film it’s suggested that Silicon Valley’s larger privacy problems made a psychological warfare campaign inevitable. Julian Wheatland, former COO of Cambridge Analytica, said: “There was always going to be a [scandal like] Cambridge Analytica. It just sucks for me that it was Cambridge Analytica …”
It’s a horrible cop-out. Even if at some level that’s true, it’s no excuse for engaging in unethical behaviour.
To what extent can decisions of free will be based only on the information that’s available and the validity of that information? It could also depend on what information a person seeks out for themselves or whether they receive it without asking for it.
Yes maybe that is where it crosses a line. By being fed a possibly skewed view of reality they are essentially deprived of their true freedom.
In The Great Hack, we see the guy interrogating Carole Cadwalladr [the journalist who exposed the scandal in 2018] asking her if she is trying to come up with an excuse to overturn the EU referendum result. She says it has nothing to do with that – of course it may have impacted the result and that is not good, but it is part of our democratic process to make sure this is sorted out.
To what extent do you think Cambridge Analytica got Trump elected?
There is no doubt in my mind that it was a big contributing factor. I think he pretty much did get elected thanks to the smarter targeting.
Do you think it should invalidate the election result? Was it effectively cheating?
That’s what I am not sure about. There is nothing in the constitution or in electoral law that makes that level of micro-targeting illegal, so as long as the data was acquired legally, no laws were broken and the result should be respected. The question is: do we now want to prevent that level of micro-targeting in elections?
Can we stop it?
I don’t know. You could say let’s pass a law that forbids these political campaigns to use micro-targeting, data scientists and these kinds of machine learning algorithms. But whether it’s possible to enforce that is not entirely clear. With large corporations like Facebook and Google, it might be possible to get them to comply. But it has become easy for one person or group of people to do this.
Even if you wanted to make this illegal, it is completely unenforceable. So you have to think about not how to make this illegal but how to mitigate the more nefarious consequences of having this data. Maybe it is just public awareness. It is very analogous to the public health situation where you can’t force people to stop smoking but you can do a lot to raise the awareness of the general population about the dangers of certain activities and then it’s down to individuals to make a choice.
People might think: “I know that the info I am getting is biased, but I am still intrigued enough to look at what they have to say to me.”
To what extent do you think Cambridge Analytica orchestrated Brexit?
It’s true that it played a decisive role.
To what extent did Cambridge Analytica change the course of history and impact the whole world?
The UK leaving the EU is something the UK will have to live with for decades to come. Even if Brexit was reversed, it would still be highly problematic because the country has become so polarised over the issues. So in some sense the damage that has been done to the UK and Europe is harsher and deeper than the effect [of Cambridge Analytica] in the US. The US is a more powerful country with a stronger impact on everyone on the planet, but Brexit will probably have longer-lasting negative consequences.
Since Cambridge Analytica was exposed in 2018, nothing has changed to resolve the issues. Electoral law hasn’t changed. In Europe, GDPR helps address some issues around data in general but it wasn’t designed with elections specifically in mind. The film leaves us on a cliffhanger.
In her TED talk, Carole Cadwalladr asked if we can ever have a fair election again anywhere in the world. Do you agree? Do the methods used by Cambridge Analytica pose a threat to democracy?
Interesting question. It depends on how you interpret the word ‘fair’. A simplistic answer could be that the Vote Leave campaigners and the Republicans temporarily got the upper hand by hiring Cambridge Analytica, but by the next elections there will be a Cambridge Analytica equivalent working for the other side of the political spectrum. If that is your criterion for fair – that both sides have access to the same weapons – then yes, we will have fair elections again. But I don’t think we should set the bar that low.
To use a sports metaphor, imagine we have a fencing match where both sides are using identical sabres, and then all of a sudden someone comes in with an AK-47. One way of making this a fair sport again is to give both sides an AK-47, but I’m not sure that is the kind of sport we want to be playing.
There have been calls to break up the big tech companies. What do you think?
It’s not entirely clear what benefits would come from breaking up Google, Facebook or Amazon. Would having two or five Googles be better than one? Each competing for data and on how good they are with privacy and the like? I don’t think that’s an immediate solution to the issues raised in The Great Hack. If we did break up the big tech companies, we would still be facing the same kinds of questions.
What about the statement that “data rights are human rights”? Should everyone have the right to request the data that is held on them?
I very much feel that should be the case. Now that all this data infrastructure exists, it’s important to think about whether there are any new human rights we need to invent or discover …