Members of the US Congress were left doubting big tech’s willingness to fight misinformation online after grilling the CEOs of Facebook, Google and Twitter for over five hours yesterday. The session offered several hints at the future of US data regulation.
While the CEOs spent most of the session dodging any responsibility for the spread of disinformation and extremism on their platforms, they did offer some ideas on how they could be better regulated. Unsurprisingly, none of those suggestions seemed to call for any radical shift to their business models.
The House Committee on Energy & Commerce had called the hearing in February, just weeks after the violent riot at the US Capitol. The insurrection came against a growing backdrop of extremism, conspiracy theories, hate speech and simple crackpot lunacy spreading online.
The torrent of misinformation intensified in 2020, with falsehoods about Covid-19 and the US presidential election spreading across social media platforms. Following the November vote, claims that the Democrats had stolen the election gained momentum, backed by Donald Trump and his supporters. Matters came to a head with the storming of the US Capitol, a riot that cost five people their lives and left over 140 injured.
Several of the congresspersons on yesterday’s committee panel had experienced the attack first-hand, as they repeatedly reminded the tech bosses during the five-hour marathon session, asking why they’d failed to do more to halt the spread of fake news.
“These are serious issues and, to be honest, it seems like you all just shrug off billion-dollar fines,” said Mike Doyle, chair of the Subcommittee on Communications and Technology. “Your companies need to be held accountable. We need rules, regulations, technical experts in government and audit authority of your technologies. Ours is the committee of jurisdiction and we will legislate to stop this. The stakes are simply too high.”
Several speakers echoed the sentiment. Consumer Protection and Commerce Subcommittee chair Jan Schakowsky said she was introducing changes to Section 230 of the Communications Decency Act of 1996, a central internet law. Her changes would remove the protection tech firms enjoy against liability for third-party content on their platforms.
“Self-regulation has come to the end of its road,” she said.
Facebook founder Mark Zuckerberg welcomed the push for more oversight, albeit with some caveats.
“[We] are ready to work with you beyond hearings to get started on real reform,” Zuckerberg said.
The Facebook chief added that he wouldn’t want to deny people the right to post about their lived experiences, even if those posts were not verifiably true.
“I don’t think anyone wants a world where you can only say things that private companies judge to be true,” he said. “Where every text message, email and video and post has to be fact-checked before you hit send. But at the same time we also don’t want misinformation to spread that undermines confidence in vaccines, stops people from voting or causes other harms.”
Zuckerberg has voiced support for amendments to Section 230 at previous hearings. During Thursday’s virtual probe, he reiterated his idea that tech firms should no longer be granted blanket immunity, but should instead have to demonstrate that they have systems in place to identify and remove harmful and unlawful content. This, he said, could be done in three ways.
“One is around transparency that large platforms should have to report at a regular cadence for each category of harmful content, how much of that harmful content they are finding and how effective their systems are in dealing with it,” Zuckerberg argued.
Secondly, he suggested clear standards for large platforms on how effective their systems should be at handling “clearly illegal content like opioids or child exploitation or things like that.”
Thirdly, he believed these new rules should only apply to larger firms, not startups.
“[When I] started out with Facebook, if we’d gotten hit with a lot of lawsuits around content, it might have been prohibitive for me to get started,” Zuckerberg said. “And I wouldn’t like to see the next set of startups being stopped from being able to get started and grow.”
Facebook already deploys both human and artificial intelligence-based fact-checking, has introduced an Oversight Board, and has partnered with an “independent” fact-checking charity largely funded by itself. It seems likely that the kind of system Zuckerberg envisions for the rest of the industry would take similar forms.
Zuckerberg also said that he was open to new US privacy regulations, despite Facebook already being entangled in several General Data Protection Regulation (GDPR) cases in the EU.
“I do believe that the US should have federal privacy legislation because I think we need a national standard,” he said. “And I think having a standard across the country that is harmonised with standards in other countries would actually create clearer expectations of industry and make it better for everyone.”
The Facebook CEO did caveat the remark though, saying that strangling the flow of data could restrict research and innovation.
Several politicians doubted Zuckerberg’s sincerity. For instance, congresswoman Lori Trahan noted how Facebook and Google-owned YouTube are currently building child-focused platforms.
“This committee is ready to legislate to protect our children from your ambition,” she said. “You know, what we’re having a hard time reconciling is that while you’re publicly calling for regulation, which by the way comes off as incredibly decent and noble, you’re plotting your next frontier of growth, which deviously targets our young children.”
She compared it with the old playbook of alcohol and big tobacco companies.
“Start them young and bank on them never leaving. Or at least never being able to,” Trahan said. “But these are our children and their health and their well-being deserves to take priority over profits.”
Unlike Zuckerberg, Sundar Pichai, CEO of Google’s parent company Alphabet, opposed changes to Section 230, warning that they would have unintended consequences.
“Without Section 230, platforms would either over-filter content or not be able to filter content at all,” Pichai said in his prepared remarks. “Section 230 allows companies to take decisive action on harmful misinformation and keep up with bad actors who work hard to circumvent their policies.”
Pichai stated that Zuckerberg had offered some “definitely good proposals” around transparency and accountability, but stopped short of endorsing the Facebook CEO’s ideas. He also expressed his support for federal privacy legislation.
Twitter CEO and founder Jack Dorsey, who was caught tweeting during the hearing, also opposed changes to Section 230. He argued that too much legislation would suffocate innovation and risk creating a situation where the government could decide what could be posted. At the same time, he argued that free speech was too important to be left in the hands of individual companies.
“I don’t think we should be the arbiters of truth and I don’t think the government should be either,” Dorsey said.
His solution was to create open source, shared protocols. Incidentally, Twitter announced an initiative along those lines in 2019.
“We’ve started work on such a protocol, which we call bluesky,” Dorsey told Congress. “It intends to act as a decentralised and open source social media protocol, not owned by any single company or organisation. Any developer around the world can help develop it just as any company can access its services.”
This, he argued, would create more transparency, boost innovation and “allow all of us to observe and acknowledge and address any societal issues that arise much faster.”
“Having more eyes on the problems will lead to more impactful solutions that can be built directly into this protocol, making the network far more secure and resilient,” Dorsey said. “A decentralised open source protocol for social media is our vision and work for the long term.”
Despite these initiatives, lawmakers believed that more needed to be done to ensure the tech titans’ accountability.
“This panel has done something truly rare in Washington D.C., it’s united Democrats and Republicans. Your industry cannot be trusted to regulate itself,” congresswoman Angie Craig concluded.