Fake news is not only a problem because it is fake. The big data gathered from it could be a bigger problem still.
When people read and share falsified stories online, they are more likely to do so if the piece confirms their existing beliefs. When people share this kind of story, social media companies remember.
And that data can be collected.
British parliamentarians have said they will launch a committee to look at the problem of fake news. But much damage has already been done.
This is Cambridge Analytica — it already knows you
On 3 December 2016, German-language title Das Magazin (Zurich) published an interview with Cambridge University academic and psychologist Michael Kosinski.
According to an English translation, from online blog Antidote Zine, Kosinski alleged that an academic colleague from the Psychometrics Centre at Cambridge University had replicated and sold his doctoral research to a then relatively unknown big data online marketing company, Cambridge Analytica.
Cambridge Analytica was commissioned by the lobby group campaigning for the UK to quit the European Union in 2015 and 2016, and latterly by Donald Trump’s presidential campaign, for fees alleged to total $15m.
How it’s done — measuring personality with old science and new
Since the 1980s scientists have developed models under the so-called psychometric OCEAN system, which broadly attempts to categorise a test subject, via questioning, under five principles: Openness (do they enjoy new experiences?), Conscientiousness (do they adhere to plans and order?), Extroversion (do they like spending time with others?), Agreeableness (do they consider others’ needs before their own?) and Neuroticism (do they worry a lot?).
Kosinski’s research used psychometric personality data acquired through OCEAN-model surveys.
When this was combined with data collated through voluntary surveys and social media, Kosinski and his fellow researchers were able to predict with great accuracy what choices different types of people would make.
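The survey-to-trait step described above can be sketched in a few lines of code. This is a toy illustration only: the items, the reverse-keying, the 1–5 Likert scale and the scoring scheme are assumptions for demonstration, not Kosinski’s actual instrument, and real psychometric questionnaires use far larger item pools.

```python
# Toy sketch: averaging 1-5 Likert survey answers into the five OCEAN traits.
# Items, reverse-keying and scale are illustrative assumptions only.

TRAITS = ["openness", "conscientiousness", "extroversion",
          "agreeableness", "neuroticism"]

# (trait, reverse_keyed) for each survey item, in question order.
ITEM_KEY = [
    ("openness", False),           # "I enjoy new experiences."
    ("conscientiousness", False),  # "I stick to my plans."
    ("extroversion", True),        # "I prefer to spend time alone." (reversed)
    ("agreeableness", False),      # "I consider others' needs before my own."
    ("neuroticism", False),        # "I worry a lot."
]

def score_ocean(answers):
    """Average 1-5 Likert answers into a 0-1 score per trait."""
    totals = {t: [] for t in TRAITS}
    for (trait, reverse), answer in zip(ITEM_KEY, answers):
        if reverse:
            answer = 6 - answer          # flip reverse-keyed items on a 1-5 scale
        totals[trait].append((answer - 1) / 4)  # normalise to 0-1
    return {t: sum(v) / len(v) for t, v in totals.items() if v}

profile = score_ocean([5, 4, 2, 3, 1])
print(profile)
```

The point of the sketch is that a handful of self-reported answers collapses into a compact numeric profile, which is exactly the kind of signal that can then be correlated with behavioural data such as likes and shares.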
After he published his research, Kosinski received two calls from social media giant Facebook. The first came with a threat to sue him for using the site.
The second call was to offer him a job.
According to the Das Magazin/Antidote Zine articles, it was then that Kosinski was approached by SCL, Strategic Communication Laboratories, a holding business for Cambridge Analytica, of which Trump chief strategist and former executive chair of Breitbart News Steve Bannon is a board member.
Kosinski turned them down, fearing the power of his research could be used to influence political elections (SCL marketed ‘election management’ as one of its key selling points).
When the then Ukip leader Nigel Farage announced that the Brexit Leave campaign had engaged the services of Cambridge Analytica, Kosinski recognised much of the thinking behind his own research on the Cambridge Analytica site.
He began to worry that his research had been replicated and sold on by another academic who was working in the faculty at the time.
In November 2016 Donald Trump predicted his victory in the US election would be “Brexit times 10”. In terms of targeted social media advertising he may have been right.
Previous Guardian newspaper articles have highlighted the revitalising effect that Cambridge Analytica’s techniques had on Ted Cruz’s faltering campaign, but according to Kosinski’s insight it was Cambridge Analytica’s ‘microtargeting’ of user data in the Trump campaign (and Trump’s ultimately successful sledging of his congressional competition) that secured his nomination.
Cambridge Analytica used the psychometric approach, combined with collated big data from social media, smartphone data and an array of other metrics, to launch targeted advertising on social media to highlight issues and strengthen Trump voters’ conviction in the election run-up, while also targeting fringe voters and sceptical leftists to suppress their vote.
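The ‘microtargeting’ step above can be illustrated with a toy selector that matches an ad framing to a user’s dominant trait. Everything here is invented for illustration: the trait-to-message mapping, the ad copy and the 0–1 profile format are assumptions, and they are not Cambridge Analytica’s actual targeting logic.

```python
# Toy illustration of psychometric microtargeting: choose the ad framing
# keyed to a user's highest-scoring OCEAN trait. The mapping and copy
# below are invented assumptions for demonstration only.

AD_VARIANTS = {
    "openness":          "Imagine a different future: see what change could bring.",
    "conscientiousness": "A clear five-point plan you can rely on.",
    "extroversion":      "Join thousands of your neighbours at the rally.",
    "agreeableness":     "Protect the people who matter most to you.",
    "neuroticism":       "Don't let them put your family's safety at risk.",
}

def pick_ad(profile):
    """Return (dominant trait, ad variant) for a 0-1 OCEAN profile."""
    dominant = max(profile, key=profile.get)  # highest-scoring trait wins
    return dominant, AD_VARIANTS[dominant]

trait, ad = pick_ad({"openness": 0.2, "conscientiousness": 0.4,
                     "extroversion": 0.1, "agreeableness": 0.3,
                     "neuroticism": 0.9})
print(trait, "->", ad)
```

The design point is that the same campaign message can be reframed per psychological segment, which is why the article describes both conviction-strengthening ads for supporters and demotivating ads aimed at suppressing other voters.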
While the mass surveillance and murky claims surrounding the business that supported the two biggest campaigns of 2016 should cause concern, the progression of big data as a tool for intelligence is unquestionable; the question of what constitutes good ethical guidelines, however, looms in the background.
With rising populism and Russia’s alleged willingness to take an active hand in other countries’ elections, this is not a problem that is likely to disappear.
Despite the efforts of social media firms and politicians to limit the effect of targeted advertising, the more we share and the more we link, the more we reveal our own weakness for confirmation bias.