Social media companies are expected to move away from the ad-funded business model criticised by regulators and towards a super-app model. A super-app model may, however, attract even greater regulatory scrutiny over antitrust and data privacy.
Listed below are the key regulatory trends impacting the social media theme, as identified by GlobalData.
Reform of the digital advertising industry set in motion by the General Data Protection Regulation (GDPR) is at a critical juncture. Cookies enable hyper-targeted ads, which are of great value to advertisers, but they also pose risks to individual users’ privacy. Social media companies, which rely on advertising revenues, are rushing to adapt to these changes and, in some cases, even create a new playing field.
Google’s Sandbox project is trying to redesign the ad-funded business model for the whole industry, using federated learning of cohorts (FLoC), which clusters large groups of people with similar interests. The approach “effectively hides individuals ‘in the crowd’ and uses on-device processing to keep a person’s web history private on the browser.” Critics, however, argue that it will create a walled garden and weaken publishers competing against Google to monetise ads.
Apple and Facebook are also battling it out over data privacy. Apple iOS 14 prevents iPhone tracking by default. Apps such as Facebook will be required to ask users for permission to track them, which is expected to dramatically impact advertisers’ ability to target ads, as people are unlikely to opt in. Facebook has accused Apple of using its dominant platform to interfere with how apps work. It claims that Apple wants to move the free, ad-supported internet into paid apps, and argues that changes to ad targeting will particularly affect small businesses.
Social media companies have come under fire for stifling competition and face an unprecedented level of scrutiny on both sides of the Atlantic. Large social media companies can easily become monopolies because of their network effects. The UK Competition and Markets Authority’s (CMA) 2020 report on online platforms and digital advertising concluded that the competitive threat to Facebook is limited because network effects and economies of scale inhibit the entry and expansion of rival platforms. Overall, there is a strong case for introducing ex ante regulation, with strict and transparent rules applied to digital platforms before they engage in any anticompetitive behaviour.
At the European Union (EU) level, the proposed Digital Markets Act (DMA) aims to constrain platforms so that they don’t become monopolies. It defines scale in terms of both global turnover and the number of users, aiming to identify gatekeepers that require monitoring. The Department of Justice (DoJ) and Federal Trade Commission (FTC) lawsuits against Facebook and Google have raised the prospect of companies being forced to spin off specific business units or subsidiaries. Together, the cases represent a watershed moment in US antitrust enforcement and an escalation of regulatory pressure on the technology sector.
A consensus is emerging that governments should hold social media companies responsible for the content they publish, as it can encourage anti-social and criminal behaviour. Regulators are concerned that online content aggregators, like Facebook, Google, Twitter, Baidu, Tencent, and Weibo, could undermine social norms unless regulated like media companies. Action to mitigate misinformation, however, must be balanced with the right to freedom of expression, allowing individuals to hold opinions and receive and share ideas without state interference.
We are starting to see legislation directed at curbing misinformation enacted around the world. Germany passed the Network Enforcement Act in 2018, under which platforms can be fined for hosting unlawful content, including defamation and incitement to hatred, and for not removing such posts within 24 hours. By requiring social media companies to remove illegal speech swiftly, such laws will reduce online misinformation. However, they also bypass important due process measures and incentivise social media companies to censor content rather than risk a fine.
Advertising revenue and content subscriptions have plummeted as a result of Covid-19, and thousands of journalists have either been furloughed or laid off. Media industry lobbyists and lawmakers see this as an opportunity to demand tech giants give more money to support struggling media outlets. Since 2020, Facebook and Google have been under pressure to pay news publishers for displaying their content.
Australia introduced new measures forcing Big Tech companies to pay publishers for snippets of content that appear on their platforms. In Europe, the proposed Digital Services Act (DSA), which regulates tech platforms’ content, could be amended to include the option of binding arbitration for licensing agreements. It would also require tech companies to inform publishers about changes to the methodology used to rank news stories on their sites.
In Australia, both Google and Facebook fought against the legislation before parliament. The dispute escalated in February 2021, when Facebook blocked Australian users from viewing or sharing news content on the platform, causing widespread alarm over public access to information. After the news blackout, Facebook was forced to enter into negotiations with publishers. The social media giant drew fierce criticism, particularly after critical emergency services and health pages were inadvertently cut off from the platform. News Corp, controlled by Rupert Murdoch, struck a three-year deal to provide news to Facebook, the financial details of which have not been disclosed.
This is an edited extract from the Social Media – Thematic Research report produced by GlobalData Thematic Research.