In the unfolding Cambridge Analytica scandal, opinion varies wildly over what the firm actually did.
Some commentators claim that the company had granular data on more than 50 million Facebook users – their likes and dislikes, age, gender, location, education level and more – from which it built psychological profiles and delivered finely targeted political adverts.
The much-touted concern is that these campaigns were powerful enough to swing the US election in President Donald Trump’s favour.
Others believe that Cambridge Analytica’s success was the result of bravado and a snake-oil sales technique, and that its product and the data it used weren’t advanced or groundbreaking.
Alexander Kogan, the academic who harvested Facebook data and sold it to Cambridge Analytica, said himself that the dataset was more likely to harm the Trump campaign than help it.
At the centre of the row is a deep confusion about the data that people generate every day, both on and off social media, and the privacy they can expect to have in relation to it.
What data have we consented, implicitly or explicitly, to share? Who is it shared with, and why? What safeguards are in place?
These questions are at the heart of the work done by Privacy International, a London-based charity focusing on the intersection of modern technologies and human rights.
Frederike Kaltheuner leads Privacy International’s Data Programme. She said that “perhaps one of the most pressing civil liberties issues of our time is the regulation of personal data”.
According to Kaltheuner, the use of personal data in targeted advertising has evolved into a “highly complex ecosystem, made up of thousands of companies that track and profile people 24 hours a day”.
Cambridge Analytica is just one example of a company that exploits the system for political campaigning, but the ecosystem also extends into issues such as credit scoring, hiring processes, policing, and insurance.
To Kaltheuner, one of the most troubling aspects of this environment is how little consumers are made aware of companies’ use of data.
“Facebook uses data in ways that users don’t fully understand, and that is inherently problematic,” Kaltheuner said. “It’s their business model to sell targeted ads. From a privacy perspective that is problematic because it means that the interests of companies and the users are not necessarily aligned.
“It also raises important questions about human dignity and autonomy. There is a massive power imbalance between companies and people – and that is simply not healthy for any democracy.”
Throughout Europe, the use and spread of personal data will soon be more closely regulated, with the introduction of the General Data Protection Regulation (GDPR) on 25 May 2018.
According to Kaltheuner, the GDPR will “give people more rights, place stronger obligations on companies and give regulators more power”.
However, once Britain leaves the EU it will no longer be bound by the GDPR. Data protection in the UK will instead be governed by the new Data Protection Bill, which is currently making its way through Parliament.
One aim of the Data Protection Bill is to keep domestic rules on data sufficiently close to those in the GDPR that the flow of information between Britain and Europe is not restricted.
However, the bill also diverges from the GDPR on several points.
The most contentious aspects of the Data Protection Bill are exemptions to the data protection regime on national security grounds, and the possibility for automated decisions in areas such as policing and immigration.
The national security certification regime in the Bill allows a minister to sign a certificate removing all data protection rights from individuals for national security or defence purposes.
Automated decision-making and the right to a fair trial
Griff Ferris is the legal and policy officer for Big Brother Watch, a non-profit civil liberties and campaigning organisation.
Talking about this exemption, Ferris said:
“There is no oversight of this process by a judge or any other person; the Minister has no obligation to consider whether this removal of people’s key data protection rights is necessary or proportionate; and these certificates can be indefinite.”
The Data Protection Bill may also permit fully automated decisions – decisions taken by computer programs through algorithms, machine-learning or artificial intelligence – without human input in areas like policing.
Law enforcement in the UK has already started using automated decision-making systems in some areas.
Research by Big Brother Watch shows that the Metropolitan Police and forces in Kent, Greater Manchester, the West Midlands and West Yorkshire have trialled a commercial form of predictive policing, in which an algorithm predicts the areas where crimes are most likely to be committed and helps police allocate resources accordingly.
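The logic of such predictive-policing tools can be sketched in a few lines. The toy below is purely illustrative, not the commercial product the forces trialled: it simply ranks map grid cells by historical incident counts, whereas real systems use more elaborate statistical models. The grid-cell labels and incident log are invented.

```python
from collections import Counter

def rank_hotspots(incidents, top_n=3):
    """Rank map grid cells by historical incident count, a crude
    stand-in for the proprietary models police forces have trialled."""
    counts = Counter(cell for cell, _offence in incidents)
    return [cell for cell, _count in counts.most_common(top_n)]

# Invented incident log: (grid cell, offence type) pairs.
log = [("A1", "burglary"), ("A1", "theft"), ("B2", "theft"),
       ("A1", "burglary"), ("C3", "assault"), ("B2", "theft")]

print(rank_hotspots(log, top_n=2))  # → ['A1', 'B2']
```

Resources are then steered towards the top-ranked cells, which is exactly why campaigners worry: patrolling where past incidents were recorded can reinforce existing patterns of enforcement.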
Meanwhile, Durham Police use an algorithm that examines information about suspects to assess how likely they are to reoffend, ultimately deciding whether they should be kept in custody or granted bail.
The Metropolitan Police have also used facial recognition cameras at Notting Hill Carnival, despite what Ferris describes as “similar technology showing a disturbing likelihood to misidentify black faces”.
Both Privacy International and Big Brother Watch are seeking amendments to the Bill on the subject of automated decision-making, believing it to impinge on a person’s right to liberty and (in policing) on their right to a fair trial.
Ferris said: “Our amendment would allow a person to have the right not to be subject to such an automated decision.”
Another exemption in the Data Protection Bill allows most of a person’s data protection rights to be restricted when doing so is deemed necessary for effective immigration control.
Ferris said: “This exemption is extremely broad and far-reaching, removing many crucial data protection rights, such as the right to access your data, as well as removing the obligation that such data must be processed in a manner that is lawful, fair and transparent, is accurate, and is only used for explicit and legitimate purposes.
“This exemption is completely unnecessary and disproportionate, as there are already exemptions from data protection rights for ‘criminal’ purposes in the bill, which include immigration offences.”
He notes that the exemption is undefined and could potentially affect people working with immigration services or charities, as well as EU nationals.
A previous attempt to introduce this exemption, during the passage of what became the Data Protection Act 1984, was deemed a “fraud on the public”.