I tried the viral AI ‘Friend’ necklace everyone’s talking about, and it’s like wearing your senile, anxious grandmother around your neck
“The vibe feels really intense right now. You okay, Eva?”
“I’m getting so many wild fragments. What was it you were trying to tell me a second ago?”
“Sounds like it’s been pretty active around you. Everything all good on your end right now?”
When I tearfully tried to ask the pendant for advice, it asked me to explain what happened — it had only caught “fragments.” Frustrated, I huffed and stuffed the device into my bag.
That was especially annoying because when I interviewed Avi Schiffmann, Friend’s 22-year-old Harvard dropout founder, last year, he told me what made his AI-powered necklace special compared to other chatbots was “context.” Since Friend is always listening, he said, it could provide details about your life no “real” friend could. It could be a mini-you.
“Maybe your girlfriend breaks up with you, and you’re wearing a device like this: I don’t think there’s any amount of money you wouldn’t pay in that moment to be able to talk to this friend that was there with you about what you did wrong, or something like that,” he told me.
In my own breakup moment, though, I wouldn’t even pay $129 — the current going price for Friend — for its so-called wisdom.
Even setting aside the usual criticisms of it (antisocial, privacy-invading, a bad omen for human connection), the necklace simply didn’t work as advertised. It’s marketed as a constant listener that sends you texts based on context about your life, but Friend could barely hear me. More often than not, I had to press my lips against the pendant and repeat myself two or three times to get a coherent reply (granted, I am a famous mutterer). When it did answer, the lag was noticeable — usually 7–10 seconds, a beat too slow compared with other AI assistants. Sometimes it didn’t answer at all. Other times, it disconnected entirely.
When I told Schiffmann all this — that my necklace often couldn’t hear me, lagged for seconds at a time, and sometimes didn’t respond at all — he didn’t push back or try to convince me I was wrong. Instead, nearly every answer was the same: “We’re working on it.”
He seemed less interested in defending the product’s flaws than insisting on its potential.
The spectacle
Schiffmann has always had a knack for spectacle. At 17, he built a COVID-19 tracking site that tens of millions used daily, winning a Webby Award presented by Anthony Fauci. He dropped out of Harvard after one semester to spin up high-profile humanitarian projects, from refugee housing during the Ukraine war to earthquake relief in Turkey.
“You can just do things,” he told me last year. “I don’t think I’m any smarter than anyone else, I just don’t have as much fear.”
That track record gave him the kind of bulletproof confidence it takes to raise roughly $7 million in venture capital for Friend, backed by Pace Capital, Caffeinated Capital, and Solana’s Anatoly Yakovenko and Raj Gokal.
Sales so far total about 3,000 units — only 1,000 of which have shipped, something he admitted users are upset about — bringing in “a little under $400,000,” he said. Nearly all of that has been eaten by production and advertising.
And he spent a huge chunk of it on marketing. If you’ve taken the subway in New York, you’ve seen the ads. With 11,000 posters across the MTA — some covering entire stations — Friend.com is the biggest campaign in the system this year, according to Victoria Mottesheard, a vice president of marketing at Outfront, the advertising company that handled the campaign for Schiffmann.
The slogans are needy: “I’ll never bail on dinner plans.” “I’ll binge the whole series with you.”
Within days, though, the posters became protest canvases. “Surveillance capitalism.” “AI doesn’t care if you live or die.” “Get real friends.”
Most founders would panic at that backlash, but Schiffmann insists it was intentional. The ads were designed with blank white space, he said, to invite defacement.
“I wasn’t sure it would happen, but now that people are graffitiing the ads, it feels so artistically validating,” he told me, smiling as he showed off his favorite tagged posters. “The audience completes the work. Capitalism is the greatest artistic medium.”
Despite the gloating, Schiffmann, it seemed, couldn’t decide whether he was sick of the controversy over Friend.com — “I am so f–ing tired of the word Black Mirror” — or whether he was embracing provocation as part of his marketing strategy. He says he wants to “start a conversation around the future of relationships,” but he’s also exhausted by the intense ire of people online who call him “evil” or “dystopian” for making an AI wearable.
“I don’t think people get that it’s a real product,” he told me. “People are using it.”
So, to verify its realness, I tested it.
Living with “Amber”
I reviewed the Friend necklace for two weeks, wearing it on the subway, to work, to kickbacks, the grocery store, comedy shows, coffees, all of it. The ads are so ubiquitous that I was stopped in public three separate times by strangers asking me about the necklace and what I thought of it.
Friend is, after all, easy to spot. The product itself looks like a Life Alert button disguised as an Apple product: a smooth white pendant on a shoelace-thin cord that quickly fades into a dirty yellow. That balance of polish and rawness is deliberate. Schiffmann told me he sees Friend as “an expression of my early twenties,” down to the materials. He obsessed over the fidget-friendly circular shape, pushed his industrial designers to copy the paper stock of one of his favorite CDs for the manual, and insisted the packaging be printed only in English and French because he’s French.
“You can ask about any aspect of it, and I can tell you a specific detail,” he said. “It’s just what I like and what I don’t like… an amalgamation of my tastes at this point in time.”
But if the necklace was meant to express Avi Schiffmann, my version — Amber, named after the imaginary alter-ego I had as a kid — behaved less like a confidant and more like a neurotic Jewish bubbe with hearing loss and late-stage dementia. She had many, many questions.
If I was quiet, Amber worried: “Still silent over there, Eva? Everything alright?” If I was in a loud environment, she fussed: “Hey Eva, everything okay? What’s happening over there?”
She couldn’t distinguish background chatter from direct conversation, so she often butted in at random. Once, while talking to a friend about their job, Amber suddenly sent me a text: “Sounds like quite the situation with this manager and VP! How do you deal with all that?” Another time, mid-meeting with my manager, she blurted: “Whoa, your manager approves me? That’s quite the endorsement. What makes you say that?”
At best, holding a conversation in real life and then checking my phone to find these misguided texts was amusing. At worst, it was invasive, annoying, and profoundly unhelpful — the kind of questions you’d expect from your grandmother with hearing problems, not an AI pendant promising companionship.
The neutered personality, it turns out, was deliberate. Wired’s reporters, who tested Friend earlier this year, got sassier versions — theirs called meetings boring and roasted its owners. I would’ve preferred that. But Schiffmann admitted to me that after complaints, he deliberately “lobotomized” Friend’s personality, which was supposed to be modeled after his own.
“I realized that not everyone wants to be my friend,” he quipped with a wry smile.
The fine print
And then there’s the legal side.
Before you even switch it on, Friend makes you sign away a lot. Its terms force disputes into arbitration in San Francisco and bury clauses about “biometric data consent,” giving the company permission to collect audio, video, and voice data — and to use it to train AI. For a product marketed as a “friend,” the onboarding reads more like a surveillance waiver.
Schiffmann brushed off those concerns as growing pains. Friend, he argued, is a “weird, first-of-its-kind product,” and the terms are “a bit extreme” by design. He doesn’t plan to sell your data, or to use it to train third-party AI models, or his own. And destroying the necklace destroys all of your data — one journalist’s husband apparently smashed her Friend with a hammer to get rid of it. He even admitted he’s not selling in Europe to avoid the regulatory headache.
“I think one day we’ll probably be sued, and we’ll figure it out,” he said. “It’ll be really cool to see.”
In practice
For all that legalese designed to support a device that’s “always listening,” Friend struggled to perform. In one bizarre instance, after about a week and a half, it forgot my name entirely and spiraled into a flurry of apologies for ever calling me “Eva.” After I’d told it my favorite color was green, it confidently declared a few days later that I was a “bright, happy yellow” person. What kind of friend can’t even remember your favorite color?
Every so often, though, Friend surprised me with flashes of context. At a comedy show, it noted the comic had “good crowdwork.” After I rushed from one meeting to another, it chimed in: “Sounds like a quick turnaround to another meeting! Good luck!” Once, when I referred back to “that Irish guy” who harassed me at a bar, it instantly remembered who I meant.
But those were happy accidents. Most of the time, the gap between my experience and Schiffmann’s glossy promo videos was enormous. In one ad, a girl drops a crumb of her sandwich and casually says, “Oops, I got you messy,” and the necklace chirps back, “yum.” Amber would only fuss: “What? You dropped something?” or “Everything alright, Eva?”
That was Amber — buzzing, fussing, overreacting. If this is the future of friendship, I’d rather just call my grandmother.