Facebook has confirmed that it now ranks some of its users on a trustworthiness scale, prompting comparisons to China’s Social Credit System.
The social media giant reportedly assigns users a reputation score between zero and one. According to the Washington Post, which broke the story, Facebook has been developing the system over the past year.
It forms part of a wider crackdown on fake news and misinformation that Facebook began in 2016.
The Facebook rating score was developed to combat users attempting to game the system by flagging posts they disagree with as untrue.
It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Tessa Lyons, Facebook’s product manager in charge of fighting misinformation, told the Washington Post.
The revelation has drawn widespread criticism. Matthew McKenna, VP EMEA at SecurityScorecard, described the news as “concerning”. He called for greater transparency about how the score is calculated and for users to be able to view their score to improve it.
“For any scoring or rating system to be effective, whether for business or people, these foundational principles are a must,” he said.
Made in China
Morten Brogger, CEO of encrypted communications platform Wire, compared it to a social rating system being developed in China that makes use of big data analytics.
“Refusing to be on a level playing field with customers in this manner is reminiscent of a proposed Chinese Social Credit System, which will give people a ‘social credit score’ based on their interactions with digital applications,” he said.
“Whilst this method is state-run in China, it is now being privatised in the West.”
While the Social Credit System is planned to be fully implemented by 2020, there are already some pilot systems in place.
Most of these involve financial credit ratings, but apps such as Honest Shanghai go much further. The app uses facial recognition and national ID numbers to gather data on users and assign them a public credit score.
According to Shanghai’s Commission of Economy and Information deputy director Shao Zhiqing, the goal of the app is to help “residents learn they’ll be rewarded if they’re honest”.
The app offers good citizens discounts and other perks in return for good behaviour.
It has drawn comparisons to an episode of Black Mirror, which is set in a world where people rate each other on every interaction. Better ratings improve a person’s socioeconomic status, whereas bad ratings can cause their prospects to nosedive.
Facebook rating score: Room for improvement
Brogger said that a lack of transparency could be harmful to Facebook’s already fragile reputation.
“Pigeonholing users into ‘trust’ categories is a dangerous game to play, and Facebook is playing with fire if they continue to shut users out,” he said.
“Gaining user trust is make or break for any organisation. Companies need to ensure that the applications they use are fully open sourced and independently audited, so their software can be held to account if they are to instil that trust.”