Aside from its ecommerce business, Amazon has spent years building proprietary technology, including artificial intelligence tools such as Rekognition.
Rekognition is a video and image analysis tool. It uses artificial intelligence to identify objects, text, and people in photos and videos. Amazon suggests the tool could, for example, identify celebrities in videos and link back to their IMDb pages.
The technology currently helps power sites like Pinterest and social media influencer programmes like Open Influence.
However, Amazon is currently trying to sell Rekognition to US law enforcement. Marketing documents unearthed by the American Civil Liberties Union (ACLU) show Amazon pitching the software to police forces in at least three US states.
And this has caused some consternation among shareholders.
What has happened?
A letter signed by 19 Amazon shareholders demands Amazon stop marketing the tool towards law enforcement.
It follows a previous open letter penned by the ACLU which accuses Amazon of ‘powering a government surveillance infrastructure’ with the technology.
The ACLU paints an Orwellian portrait of Rekognition’s future. It argues the technology will be used to surveil individuals constantly, track where people are at all times, and build databases of people without their consent.
In its letter, the ACLU writes:
“Amazon states that Rekognition can identify people in real-time by instantaneously searching databases containing tens of millions of faces. Amazon offers a “person tracking” feature that it says “makes investigation and monitoring of individuals easy and accurate” for “surveillance applications.” Amazon says Rekognition can be used to identify “all faces in group photos, crowded events, and public places such as airports”—at a time when Americans are joining public protests at unprecedented levels.”
The letter pleads with Amazon CEO Jeff Bezos to ‘take Rekognition off the table for governments’. Its reasoning is clear:
“People should be free to walk down the street without being watched by the government. Facial recognition in American communities threatens this freedom. In overpoliced communities of color, it could effectively eliminate it. The federal government could use this facial recognition technology to continuously track immigrants as they embark on new lives. Local police could use it to identify political protesters captured by officer body cameras.”
The ACLU also shared a petition that, at the time of writing, had been signed by around 60,000 people.
It didn’t take long for Amazon to reply to the ACLU’s accusations. In an emailed statement, the company wrote:
“As a technology, Amazon Rekognition has many useful applications in the real world. Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes?”
Still, that did little to dissuade the ACLU from criticising Amazon for selling the technology to law enforcement.
Google reveals its policy:
In addition to the criticism from the ACLU, Google stepped in to pointedly state its own policies around AI’s potential uses.
In a blog post, CEO Sundar Pichai laid out his company’s seven core principles guiding Google’s AI development. These are:
- Be socially beneficial.
- Avoid creating or reinforcing unfair bias.
- Be built and tested for safety.
- Be accountable to people.
- Incorporate privacy design principles.
- Uphold high standards of scientific excellence.
- Be made available for uses that accord with these principles.
Pichai also pledged that Google’s AI will not be used for weaponry or surveillance purposes. This stands in stark contrast to Amazon’s position on the matter.
Amazon’s shareholders weigh in:
However, despite Amazon’s rebuttal of the criticism, its shareholders are not happy. This week, 19 shareholders wrote to Bezos urging him to halt the rollout of the technology until their concerns were addressed.
The shareholders’ letter contains numerous reports of times where facial recognition technology has proved inaccurate, particularly when identifying people of colour. They note that US consumers may be put in harm’s way thanks to inaccurate facial recognition.
However, the shareholders also recognise more broadly how the technology could be used:
“We are also concerned sales may be expanded to foreign governments, including authoritarian regimes. Without protective policies in place, it seems inevitable the application of these technologies will result in Amazon’s Rekognition being used to identify and detain democracy advocates. Experience has shown repressive governments tend toward incarceration and torture of identified people who are opposing repressive practices, and the surveillance technologies will tend to harden this circle of repression.”
Why it matters:
Of course, the shareholders’ interest in the matter is much more than purely philanthropic.
“The recent experience and scrutiny of Facebook demonstrates the degree to which these new issues may undermine company value as the detrimental impacts on society become clear.”
This line refers to Facebook’s recent Cambridge Analytica scandal, in which Facebook shared the private data of millions of users with a firm that used it to help manipulate election advertising. The scandal saw billions wiped off Facebook’s market cap and user trust in the service reach an all-time low.
In addition, founder Mark Zuckerberg was hauled in front of lawmakers in several countries to answer for the situation.
While Facebook has since recovered, that recovery was by no means assured. Clearly Amazon’s shareholders are concerned that another scandal of that magnitude could have a serious negative impact on the company’s value.
As technology companies grow, the personalisation and user-recognition features that made them popular with consumers will almost certainly become more and more of an issue. In a world where consumers are the product, tech companies need to start taking more care with them.