Facebook withheld a transparency report revealing that the most popular post in Q1 2021 was a link to an article spreading Covid-19 vaccine misinformation. For the Menlo Park-headquartered giant, being caught suppressing a report that made it look bad couldn’t come at a worse time.

Over the past year, it has been busy attempting to clean up its reputation as a haven for conspiracy theories, fake news and untruths. Facebook has done so as lawmakers, regulators and the public have upped the pressure on social media platforms to tackle the misinformation "infodemic", as noted by a recent GlobalData thematic research report.

Last week, Facebook published a report on the most popular posts, domains and links on its platform in Q2 in an effort to prove its commitment to transparency.

However, The New York Times later revealed that a previous report covering the period between January and March had been produced and not published. The newspaper had gotten hold of the Q1 report before Facebook itself published it late on Saturday.

The report reveals that the most-viewed link on Facebook in Q1 was a since-updated news story that suggested a Florida doctor’s death may be linked to the Covid-19 vaccine.

The New York Times reported that top executives had decided against publishing the Q1 report given the public relations problems it would cause.

Facebook replies

In response to the New York Times story, Facebook policy communications manager Andy Stone went on Twitter over the weekend, tweeting that the "criticism isn't unfair" but arguing that the Q1 report that included the Covid-19 story should be seen in the right context.

The policy communications manager then went on a brief rant about what should and should not count as misinformation.

“News outlets wrote about the south Florida doctor that died,” Stone continued. “When the coroner released a cause of death, the Chicago Tribune appended an update to its original story; NYTimes did not. Would it have been right to remove the Times story because it was Covid misinfo?

“Of course not. No one is actually suggesting this and neither am I, but it does illustrate just how difficult it is to define misinformation.”

On the question of why Facebook withheld the Q1 report but published the Q2 one, Stone said it was “because there were key fixes to the system we wanted to make.”

“We’re guilty of cleaning up our house a bit before we invited company,” he added. “We’ve been criticised for that; and again, that’s not unfair.”

The second report

On August 18, Facebook released a report about how its US audience used the platform in the period between April and June 2021. The Widely Viewed Content Report: What People See on Facebook covered much the same ground as the withheld Q1 report, with one big exception: it painted a rather rosier picture.

“Transparency is an important part of everything we do at Facebook,” the company said. “In this first quarterly report, our goal is to provide clarity around what people see in their Facebook News Feed, the different content types that appear in their Feed and the most-viewed domains, links, Pages and posts on the platform during the quarter.”

The most viewed post during Q2 was a post with a picture full of letters, telling the reader that the first three words they'd see "are your reality". It was viewed more than 80.6 million times. The second most viewed post prompted those over the age of 30 who still looked young to share a picture of themselves. It was viewed 60.4 million times. Coming third was a post claiming that people's porn names were their middle name and the first car they owned. It was viewed over 60.2 million times.

Other popular posts included US President Joe Biden celebrating his first 100 days in office, a reporter asking if people would ditch their masks and a post with several pictures depicting men in uniform helping minorities and people of colour.

The most popular pages included UNICEF, UniLad and the Daily Mail. The most-viewed domains included YouTube, UNICEF, Amazon and Spotify.

Facebook’s ongoing Covid-19 fake news woes

Facebook's struggles with Covid-19 misinformation are nothing new. In the almost 20 months since news of the pandemic began to spread from Wuhan, social media giants have faced intense scrutiny. And they have taken action, launching several initiatives to tackle the dissemination of untruths on their platforms.

“Covid-19 is likely to be a point of no return for big social media companies on misinformation,” a new report from GlobalData noted in June. “During the pandemic, these companies have actively tackled the dissemination of fake news. This shift to a more proactive stance on misinformation will likely be permanent, as, after the pandemic, they will be expected to keep policing harmful content.”

However, these initiatives have fallen short of appeasing the powers that be. For example, in March, Facebook's Mark Zuckerberg, Twitter's Jack Dorsey and Sundar Pichai, CEO of YouTube owner Google, were grilled for five hours by US lawmakers about their shortcomings in the battle against fake news, Covid-19 vaccine misinformation and the conspiracy theories that led to the Capitol Hill riot in January.

In July, Biden went so far as to say that the Menlo Park-headquartered company was "killing people" by allowing Covid-19 vaccine misinformation to spread on its platform. He later softened the message slightly, saying that the real problem was superspreaders peddling the untruths, but noting that they had been hosted on popular sites like Facebook and YouTube.

Menlo Park then published a response in which it said it is “not the reason” for the Biden administration missing its goal of vaccinating 70% of Americans by 4 July. Facebook claimed that its data shows 85% of its US users “have been or want to be vaccinated against Covid-19”.