The algorithms underpinning artificial intelligence (AI) systems have increasingly been shown to reflect the biases of their designers. Artist and game maker A.M. Darke is creating a system based on their own prejudices to highlight the problem of AI bias, calling for programmers to be held accountable for algorithms that govern everything from credit ratings to criminal convictions.

The art project has been commissioned by the Open Data Institute (ODI), which has made Darke its research and development artist-in-residence. Darke is writing an algorithm that is overtly biased against the demographic predominantly designing the algorithms influencing our lives: white men. The ongoing project seeks to flip the usual narrative, in which the inherent biases of this demographic are unwittingly reflected in the AI systems they build.

Such algorithms have led to real-world consequences for marginalised groups. In January this year, an African American man was wrongfully arrested after a facial recognition system falsely matched his photo with security footage of a shoplifter. In 2018, Amazon ditched a recruitment algorithm that discriminated against women because it was trained on datasets of CVs predominantly submitted by men.
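The mechanism behind the Amazon case is straightforward to demonstrate. The sketch below is a toy illustration, not Amazon's actual system: the CVs, features and labels are invented, and it assumes scikit-learn is available. It shows how a model trained on historically skewed hiring data can attach a negative weight to a gendered term simply because that term rarely co-occurs with past hires.

```python
# Toy illustration of training-data bias (not Amazon's system).
# The CVs and labels are invented; 1 = hired, 0 = rejected.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

cvs = [
    ("software engineer chess club captain", 1),
    ("backend developer rugby team captain", 1),
    ("systems programmer chess club member", 1),
    ("data engineer rugby club member", 1),
    ("software engineer women's chess club captain", 0),
    ("frontend developer women's coding society member", 0),
]

texts, hired = zip(*cvs)
vec = CountVectorizer()
X = vec.fit_transform(texts)
model = LogisticRegression().fit(X, hired)

# Inspecting the learned weights shows the token "women" picks up a
# negative coefficient purely because of the skew in the training data.
for word, coef in zip(vec.get_feature_names_out(), model.coef_[0]):
    print(f"{word:12s} {coef:+.2f}")
```

On data like this, nothing in the code mentions gender explicitly; the discrimination emerges from who was hired in the past.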

“I’m often trying to bridge the gap between how a dominant culture views black culture and make this connection and use that to interrogate a certain kind of politics of oppression,” Darke said, speaking at the ODI’s recent Data Futures event. “I think with this commission it was different because I was talking to people who already were aware of the problems.”

Darke aims to use their art to make audiences feel complicit and responsible, in the hope that they are spurred into action. Instead of focusing on who is harmed by the way we use data and over-rely on algorithms, Darke wanted to highlight the people who help build the oppressive systems and structures that lead to AI bias.

“In my work, I try to avoid taking marginalised experiences and then serving them on a platter for a more privileged audience,” said Darke, who is assistant professor of digital arts and new media, and critical race and ethnic studies at UC Santa Cruz.

“We tend to feel a certain amount of guilt and then we feel good about ourselves for feeling bad and then we don’t do anything. So highlighting people who are building these systems in a ‘just following orders’ way: not necessarily these large tech icons, who we think of as all-powerful, but the engineers, the people working in content moderation, the people working on policy, just everyday people who are building these structures that are deeply harmful.”

“I wanted to create a really biased and belligerent algorithm”

Darke looked at whose data is freely bought and sold, not only from users who opt in but also from people who don’t consent. For example, many open datasets used for facial recognition are based on pre-conviction mugshots of individuals who may never end up with a criminal record.

“I wanted to expose and reveal that and create a really biased and belligerent algorithm,” said Darke. “The way I described the commission was, why should a few hundred mostly white, mostly men dictate the procedures that bind us and create or limit our agency in the world? Why not just me, why not just one queer, black femme who’s kind of loud and rude and you don’t like, why not me? And that sounds so over the top but I’m like, with a commission I want it to be transparent. It would make the same kind of snap judgments these algorithms do, but I would be clear about them.”

Darke said a lawyer told them “it’s easier to be legal than challenging”, which they took to mean that in some ways it is easier to do unethical things that conform to the rule of law than it is to do something challenging that actually serves social good.

Darke explained: “I wanted to use people’s public profiles from social media sites, from spaces like LinkedIn, and say, okay, this person works here and this is what they do, they work with policy to coordinate and cooperate with law enforcement. And to me, I look at that and I say, well, not only have you made this information public but you’ve done so to advertise what you do so that you can benefit, make more money, get new positions, and I look at that as fair game.

“What I wanted to do was take factual information, compile that into the identity sets, and then do a subjective, knee-jerk superficial analysis of, say, how likely you are to be a ‘Karen’, or want to speak to the manager, how likely you are to call the police on a brown person in your neighbourhood. This very biased lens; basically, an algorithm that, instead of criminalising me as a person of colour in the United States, which is this particular context, but thinking about what algorithms are useful.”
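The pipeline Darke describes could, in spirit, look something like the sketch below: factual profile fields go in, and openly subjective, knee-jerk rules come out. Every field name, rule and weight here is invented for illustration; this is not Darke's system, only a hypothetical example of scoring that makes its snap judgments transparent rather than hiding them inside an opaque model.

```python
# Hypothetical sketch of a deliberately biased, rule-based score.
# All fields, rules and weights are invented for illustration.

def karen_likelihood(profile: dict) -> float:
    """Return a 0-1 'likely to ask for the manager' score from snap judgments."""
    score = 0.0
    title = profile.get("job_title", "").lower()
    skills = [s.lower() for s in profile.get("skills", [])]

    # Openly prejudiced heuristics, stated in plain sight.
    if "policy" in title or "compliance" in title:
        score += 0.4   # snap judgment: writes the rules others must follow
    if "law enforcement liaison" in skills:
        score += 0.4   # snap judgment: cooperates with police
    if profile.get("neighbourhood_watch_member", False):
        score += 0.2   # snap judgment: watches the neighbours

    return min(score, 1.0)

# Example public-profile data (entirely fictional).
profile = {
    "job_title": "Trust & Safety Policy Lead",
    "skills": ["content moderation", "law enforcement liaison"],
    "neighbourhood_watch_member": True,
}
print(karen_likelihood(profile))  # 1.0 under these invented rules
```

The point of such a sketch is the transparency: unlike a commercial risk score, the prejudice is legible in the rules themselves.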

AI bias: “Your body is bought and sold again as data”

The other issue Darke aimed to address was responsibility for sharing and using that data. A significant barrier here is the overwhelming amount of information individuals are faced with: people sign up to social media platforms without reading the detailed terms of service setting out how their data could be used. Darke, for their part, believes platforms should come with credits for everyone who worked on them.

“I do think it’s important that we as individuals and society – whether we’re engaging with these systems or building these systems – think about our own agency and the actual harm they do so that we can make different choices. But I also think that this is a systemic issue; after climate change, we have to think about [data abuse] as an oppressive human rights issue.”

Darke referred to a conversation with fellow data artist Kyle McDonald about how, in some of the mugshots used to train AI systems, some of the people are no longer alive, but their data is in the system forever.

“If you think about how black bodies, in particular, have been criminalised in the United States, it’s like slavery is justified but only if you are incarcerated,” they said. “Thinking about the prison industrial complex, your body is bought and sold again as data; you just have no agency. If we think about data as an extension of our bodies, then we can think about this in a different, more systemic way.”


Read more: Half of UK data leaders say their company data risks ethnic bias