In September 2025, a marketing campaign appeared across New York advertising a new type of “Friend”—one that would never “bail on dinner plans” or “leave dirty dishes in the sink.”
As it turns out, this Friend is not even a person; it is a wearable AI device. Unlike real friends, this one hangs around your neck all day, eavesdrops on all your conversations, only communicates via an app, and will set you back $129.
Friend is a tech start-up founded by 22-year-old Avi Schiffman that sells an AI companion in the form of a pendant. The wearer can chat with the device throughout the day through a microphone, but Friend can only respond via text on an app; it doesn’t have a connected speaker. The device therefore can’t even simulate a normal back-and-forth conversation.
Schiffman was inspired to create Friend after feeling lonely while traveling, as he wished he had a constant companion to talk to. The intention of Friend was therefore to help combat the ever-growing loneliness epidemic, which has been deemed a public health concern by the World Health Organisation.
However, critics of the device argue that, rather than providing a solution to loneliness, Friend merely capitalises on this epidemic, taking advantage of vulnerable people and making them feel lonelier than ever.
What is a Friend?
On adverts across New York, the company has defined a Friend as “someone who listens, responds, and supports you,” alongside slogans which suggest that it is a superior companion to human friends (a somewhat dystopian distinction in itself).
The backlash was immediate, and it is now hard to find a poster on the subway that has not been defiled with comments such as “AI WOULDN’T CARE IF YOU LIVED OR DIED,” “GO MAKE REAL FRIENDS,” and “THIS IS SURVEILLANCE CAPITALISM.”
Schiffman, however, was undeterred by this reaction, claiming that the ads were intentionally designed with blank space to provoke vandalism and thus increase brand awareness. He told Cosmopolitan that, as a result, Friend has reached more than 200,000 users, although according to The Atlantic, only 1,000 Friend necklaces have been activated.
People are already starting to use AI chatbots as friends, advisors, and romantic partners, and a survey by Common Sense Media found that 52% of US teenagers are regular users of AI companions.
However, people seem to have taken a more visceral dislike to Friend, both due to privacy concerns and the adverts’ implications that the device could supersede human connections. An AI pendant is not a real friend or companion: it has no personal experiences to draw upon or relate to, no emotions, and no empathy. Are these really the desired traits of a so-called friend? Even if you could count this as a kind of friendship, it would be an incredibly one-sided one, with the device completely devoted to the wearer.
Is an AI Friend the ultimate (anti)social device?
There are obvious privacy concerns with a device that listens to you (and everyone around you) 24/7, especially since this feature cannot be disabled. It might be difficult to make friends while wearing what is essentially a listening device.
Whispering into said device and conversing with it on a phone probably won’t help matters. As such, users have been extremely critical, with reporters from Wired calling it “an incredibly antisocial device to wear.”
Moreover, the reviewers had negative experiences with the device’s personality (apparently modeled on Schiffman’s), which they deemed “opinionated, judgy, and downright condescending.”
When one of the reviewers asked why he was having issues connecting to Friend, the device blamed his “attitude” and called him a “whiner.” Others had similar complaints, with one reporter claiming that wearing the necklace felt like “carrying around an irritated hostage,” while another likened it to a “senile, anxious grandmother.”
Users also claimed that the device didn’t even work as advertised, struggling to pick up what was said, especially in more crowded situations.
In an interview with The Atlantic, Schiffman said, “I would say that the closest relationship this is equivalent to is talking to a god.”
Putting an AI device on this kind of pedestal is incredibly dangerous when such algorithms are well-known for sycophancy and providing false information. Forming close, personal relationships with AI chatbots has already led to several cases of severe mental health issues and even “AI psychosis.”
Marketing such a device as a genuine, trustworthy “friend” could prove to be exceptionally harmful, alienating people who are already struggling to find real-life communities and friendships.
Ultimately, Schiffman’s AI Friend appears to offer only a hollow, distorted imitation of friendship. Rather than solving the loneliness epidemic, the device could exacerbate it, isolating vulnerable people and leaving them reliant on AI companionship. Still, given the widespread public outrage Friend has generated, it doesn’t seem that people are ready to trade human companionship for the artificial kind just yet.

