OnlyFans is in talks to raise funds in a move that would see the company valued at over $1bn. The social media platform connects content creators with fans, who pay subscription fees of between $5 and $50 a month. But nascent regulations aimed at protecting vulnerable users will affect the company's plans.

Content creators on OnlyFans include professional boxer Floyd Mayweather and rapper Cardi B. Fashion designer Rebecca Minkoff launched an account in 2021 to showcase backstage footage from New York Fashion Week. Non-celebrity creators include artists, chefs, and fitness instructors. But the platform is infamous for its user-generated explicit adult content.

Ad space is available

The startup is approaching backers to help it shift its reputation towards that of a mainstream media platform and, crucially, to attract advertisers. This potential unicorn, which has rejected previous special purpose acquisition company (SPAC) propositions, could invigorate the UK tech market and the London Stock Exchange – which suffered a blow after Deliveroo’s March 2021 initial public offering (IPO) flop – if it files for an IPO.

Boosted by Covid-19, OnlyFans handled over $2bn in sales in 2020. Its 20% fee on all creator earnings generated over $400m in revenue. This contrasts with many social media platforms, which rely on advertising for revenue. Although some advertisers may not want to be associated with OnlyFans to minimize reputational risk, there is potential to attract a different kind of advertiser.
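As a back-of-the-envelope check, the commission model is simple enough to sketch. The snippet below is a minimal illustration assuming a flat 20% platform fee on the reported $2bn of gross 2020 sales; the variable names and flat-fee assumption are illustrative, not a statement of how OnlyFans accounts for its revenue.

```python
# Rough illustration of a flat-commission model (assumption: a uniform 20% fee
# on gross creator sales; figures are the reported 2020 numbers).
GROSS_SALES_USD = 2_000_000_000   # gross sales handled by the platform in 2020
PLATFORM_FEE = 0.20               # platform's cut of creator earnings

platform_revenue = GROSS_SALES_USD * PLATFORM_FEE
creator_payouts = GROSS_SALES_USD - platform_revenue

print(f"Platform revenue:     ${platform_revenue:,.0f}")   # $400,000,000
print(f"Paid out to creators: ${creator_payouts:,.0f}")    # $1,600,000,000
```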

Age is not just a number

Social media influencer is one of the few careers where women routinely out-earn their male counterparts. OnlyFans’ creators are largely women, who were disproportionately affected by Covid-19. Some content creators have become millionaires from their accounts. But for many, the platform provides a lifeline or a critical second income. The site also attracts many young people who view it as a safe way to make money quickly.

However, there are also many vulnerable people on OnlyFans, some of whom may not have consented to creating or sharing content. There are also reports of missing children surfacing in images.

OnlyFans claims that its age verification systems go ‘over and beyond’ regulatory requirements, and states that all users and creators must be over 18. But the site fails to prevent underage users from accessing, creating, appearing in, and selling explicit content. Nor does it safeguard children from being exploited or from appearing in explicit material.

OnlyFans technology is not capable of everything

The site’s age verification systems have been called into question. The site requires applicants to submit a selfie holding their ID next to their face. However, a 14-year-old reportedly accessed the site using her grandmother’s passport and bank details, suggesting the checks fail to do their job and that the AI-based facial recognition behind them is not fit for purpose.
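To see why such checks are easy to defeat, consider a minimal sketch of the typical selfie-to-ID matching step, here written with the open-source face_recognition library. This is an illustrative assumption, not OnlyFans’ actual pipeline, and the file names and tolerance value are placeholders. Even when the match works correctly, it only confirms that the two faces are similar; it says nothing about whose ID it is or how old the applicant is.

```python
# Minimal sketch of a selfie-to-ID face-matching step (illustrative only:
# NOT OnlyFans' actual pipeline; file names and tolerance are assumptions).
import face_recognition

def selfie_matches_id(selfie_path: str, id_path: str, tolerance: float = 0.6) -> bool:
    """Return True if the face in the selfie matches the face on the ID photo."""
    selfie = face_recognition.load_image_file(selfie_path)
    id_photo = face_recognition.load_image_file(id_path)

    selfie_faces = face_recognition.face_encodings(selfie)
    id_faces = face_recognition.face_encodings(id_photo)
    if not selfie_faces or not id_faces:
        return False  # no face detected in one of the images

    # compare_faces applies a distance threshold; a generous tolerance or a
    # low-quality ID photo lets mismatches slip through.
    return bool(face_recognition.compare_faces([id_faces[0]], selfie_faces[0],
                                               tolerance=tolerance)[0])

# A passing match proves face similarity only: it does not prove the ID or the
# bank details belong to the applicant, or that the applicant is over 18.
```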

When social media attracts the attention of teens and children, the attention of regulators and watchdogs is not far behind. In 2019, TikTok was handed a $5.7m fine in a US case surrounding children’s data privacy.

In the same year, the UK government published the Online Harms White Paper to tackle the dangers posed by online content. Two years later, the draft Online Safety Bill, published in May 2021, proposes fines of up to £18m or 10% of annual global turnover, whichever is greater, for companies that fail to keep children safe on their platforms.

The draft legislation means social media sites must act on harmful content, even when that content is legal. Otherwise, companies could face fines or criminal action from Ofcom, the regulator. The legislation also encompasses misinformation, meaning that social media sites are likely to err on the side of censoring content rather than risk a fine, even though this may impinge on free speech.