In 2018, Canadian data scientist Christopher Wylie found himself at the heart of an international scandal after exposing that Cambridge Analytica harvested the data of 87 million Facebook users without consent, using that data to target voters with political ads.

After being employed by Cambridge Analytica as a contractor in 2013, Wylie became a whistleblower, working with the Guardian and Observer to lift the lid on the organisation, sparking arguably the largest data privacy scandal in history.

Credited with “setting up and then taking down” Cambridge Analytica, he is now an outspoken critic of social media platforms’ use of user data.

Speaking at Big Data LDN 2019, Wylie shed light on how Cambridge Analytica facilitated the mobilisation of the alt-right, and how this can be prevented from happening again.

The spread of radicalisation

Wylie explains that the political consulting firm, established in 2013, was originally involved in counter-radicalisation. However, the organisation later became involved in electoral processes:

“When I first started, we were doing work for the military. We were doing work on essentially counter-radicalisation and trying to map out how radical narratives spread online and who would be more vulnerable to that and what kind of intervention activities are warranted…that approach was essentially inverted.”

Although the exact impact of Cambridge Analytica on the outcome of political events such as the 2016 election of Donald Trump is unknown, Christopher Wylie explains that political ad targeting was only part of the picture:

“So Cambridge Analytica with its background of trying to understand young unmarried men from a particular community…that understanding was inverted. Rather than trying to identify the people and mitigate the spread of radicalisation on the internet, it takes that information to spread radicalisation on the internet; it’s just that it’s now for something called the alt-right…they may not have a job, they may have particular life circumstances, and also they have a particular personality profile that makes them more likely to engage.

“So when you have the cohort…the ad targeting is step one. I think a lot of people don’t quite understand that. It wasn’t just ad targeting. The ads were just to get you to click on something. It was at that point where you click on something you join a group, a page, you go and join a forum.

“You see something, you click on it, you then see a forum, you join it and all of a sudden someone sends you a message, “Hey, check out this link”, you start chatting with people in the group and it turns out that they have very similar stresses, they have similar beliefs and you start going down the rabbit hole, clicking on more and more things, engaging more and more.”

He explains that this technique of psychographic profiling is particularly effective because only a small number of people need to be influenced to change an election outcome:

“When you are targeting a subset of the population, you aren’t targeting everybody. You are targeting like 2% of people…it was a narrow segment of the population. And the reason for that is elections are a zero-sum game. So, if you get one more vote than the other person, you win 100%. Even if you get 51% of the vote you win 100%. Which means that narrow slivers of the population now have a higher degree of influence.”


“I do not have any trust in the company whatsoever”

Christopher Wylie believes that, compared with other industries, the technology world has gone largely unregulated, allowing platforms such as Facebook to take action only after events such as the Cambridge Analytica scandal:

“You can’t just design a building and go ‘well, if there’s a problem we’ll work it out as it happens. If there’s a fire we’ll reengineer that after the fire’… I have to think about what could go wrong and plan for that in my blueprints. And there’s a building code about what’s safe and not safe. In any of these platforms, if I’m an engineer I don’t have to think about it. I only have to think about it when the media points out there’s a problem and I have to react to it…Platforms like Facebook go ‘oh, this is very complicated, sometimes shit happens like mass shootings or maybe contributing to ethnic cleansing in Myanmar.’”

He believes that the response from the likes of Facebook has so far been “semi-credible”. Although Facebook has paid a fine for its involvement in the Cambridge Analytica scandal, whether the company will suffer any major long-term consequences remains to be seen:

“Facebook continuously reacts at the point where they absolutely have to…and they do this when they have to issue a semi-credible press release to journalists. I do not have any trust in the company whatsoever.”

“You’re going to see a re-segregation of society”

Moving forward, Wylie explains that the increasing presence of artificial intelligence in our everyday lives could have serious implications for human autonomy:

“If every aspect of your life is monitored and then the physical environment around you becomes aware of you and has a set of decisions it can make about you…how do you exercise agency as a person when you are inside something that thinks about you and decides what you see or don’t see without your knowledge or consent?”

He believes that the effects of poorly deployed algorithms are already being felt through algorithmic bias:

“When you look at even what’s happening now when algorithmic bias is starting to influence how much credit you’re able to get…if you look at the United States where you can’t get a mortgage in certain areas if you’re a person of colour…because of how data are collected, and we look at facial recognition it underperforms for women of colour because the data sets are predominantly white straight men…all of a sudden you’re getting data sets that get released to a bank and all of a sudden people are getting denied a mortgage or they’re getting a lower credit score because of the bias of the engineer that created that data set. And if you amplify that bias everywhere you’re going to see a re-segregation of society.”

Is Christopher Wylie optimistic about the future? He believes that the increase in public awareness of the importance of data protection is a cause for optimism:

“I’m optimistic because if you talked to somebody on the street, or a journalist, in 2017, one year before Cambridge Analytica, and you said data protection is a political issue, it would have been laughable. Now it is. And now we’ve got people like Elizabeth Warren, and also all the main candidates, talking about what they’re going to do in the United States about people’s privacy and people’s data and about tech monopolies. So I’m optimistic because if you think about where we are now in 2019 and where we were in 2017 there’s quite a radical shift. And I think probably within the next ten years there’s going to be legislation in the United States, finally, the only OECD country that has no national privacy law. And if that law is crafted in the right way, I think that we can start to rein in the really reckless behaviour of the tech industry.”

He also calls for a code of conduct or ethics to become “part and parcel” of the industry:

“It’s not going to just be regulation from the top…We also need change from within. When you look at other professions, whether you’re a doctor, lawyer, teacher, even an architect, there’s professional conduct standards and codes of ethics…I really think that if we actually started putting ethics at the heart of the data science profession or software, where it becomes part and parcel of what we do every day, I think that will make an impact. And so that’s something.”
