The Dark Patterns Tip Line is a new initiative aimed at naming and shaming companies that use sneaky designs to manipulate users into doing things they wouldn’t normally do.

“Dark patterns” is a term for the various ways businesses nudge users into giving up information about themselves or into paying – or continuing to pay – for services they don’t want. Designs highlighted by the Dark Patterns Tip Line include smart devices that make opt-out buttons for different services hard to find, fintech apps that don’t warn customers a free trial is ending until they are billed, health apps that obscure how they share data with third parties, and edtech businesses that share data about students.

While dark patterns have been frowned upon in the past, their US detractors now have some legal means to fight back.

In March 2021, California amended the California Consumer Privacy Act (CCPA), which is essentially the West Coast state’s equivalent of the EU’s General Data Protection Regulation (GDPR). Much like its European counterpart, the CCPA is designed to protect people’s online privacy.

The new amendments include provisions against dark patterns. Specifically, they prohibit companies from burdening consumers with confusing language or unnecessary steps, such as forcing them to click through multiple screens or listen to reasons why they shouldn’t opt out. Violators risk civil penalties of up to $7,500 for each intentional breach.

“California is at the cutting edge of online privacy protection, and this newest approval by [the Office of Administrative Law] clears even more hurdles in empowering consumers to exercise their rights under the California Consumer Privacy Act,” state attorney general Xavier Becerra said at the time.

“These protections ensure that consumers will not be confused or misled when seeking to exercise their data privacy rights.”

It is against this backdrop that the new Dark Patterns Tip Line has been launched by a “team of designers, academic researchers, legal experts, policy specialists, and advocacy-minded individuals.”

“We came together to collect dark patterns to better understand how technology is exploiting people,” the organisation explained in a statement. “Our ultimate vision is to leverage this data to combat manipulative practices online through policy reform.

“We have grounded this work by highlighting examples from people’s lived experiences so we can best showcase how dark patterns lead to everyday human harms.”

Anyone exposed to a manipulative design can report the site, app or smart device to the organisation. The Dark Patterns Tip Line makes it clear, however, that it won’t provide any legal help for anyone who’s been tricked by dark patterns and that it is only collecting the tips to “raise public awareness and for research.”

The Dark Patterns Tip Line is not the first of its kind. DarkPatterns.org was created as far back as August 2010 by UX specialist Harry Brignull to name and shame service providers using dark patterns.

There have been rumblings among regulators and lawmakers over dark patterns for some time. In 2019, a bipartisan federal bill called the DETOUR Act sought to ban large companies with more than 100 million users from using these kinds of deceptive practices. While that particular bill went nowhere, the fact that the Senate is increasingly putting Big Tech in its crosshairs may change that.

In September 2020, Federal Trade Commission commissioner Rohit Chopra issued a statement criticising edtech firm Age of Learning for making it difficult for users to cancel the service.

Across the pond, the GDPR outright bans the storage of individuals’ private information without their informed consent. Researchers have encouraged the EU to make these rules even stricter to ensure users aren’t coerced by manipulative interfaces into giving up data or buying services.

The UK’s Information Commissioner’s Office proposed new rules in 2019 that would ban companies like Facebook and Snapchat from nudging those under 18 to share more data about themselves.