Bad Connections

Talking to someone online for emotional support may be riskier than you realize

These platforms promise to connect users to a compassionate listener, but the reality is far more complex.
By Rebecca Ruiz
Peer support apps are supposed to help people with their mental health, but there are undisclosed risks. Credit: Bob Al-Greene / Mashable

An investigation into the risks and dangers of seeking comfort from strangers on digital emotional support platforms.


At a time when loneliness is a crisis, HearMe is Adam Lippin's calling. He founded the digital platform in 2018 as a place where users can talk to someone online and "get something off your chest." The platform matches each user with a "peer listener" who's meant to be supportive. Both people can remain anonymous.

But Lippin eventually learned that not everyone who logs onto a platform like HearMe has a sincere interest in making an emotional connection. In 2022, it became clear that some users were visiting HearMe to play out fantasies that involved sexual language and innuendo, Lippin told Mashable. 

On the other side of those messages were often psychology interns and graduate students in social work who volunteered on the service to fulfill their educational requirements. Lippin hoped the bad actors could be discouraged by responses that reframed or ended the conversation. But that didn't work. 

"It was like whack-a-mole," Lippin said. "It just didn't stop." 

So Lippin made a risky, consequential decision: HearMe stopped offering a free membership. Soon after, the problem largely ceased, Lippin said. 

"I learned a lesson," he said of online emotional support. "It's like anything — it can be used for good and bad."  

Lippin isn't the only founder and CEO to launch a company designed to alleviate loneliness by connecting strangers with each other. Companies like Wisdo Health, Circles, 7 Cups, and HeyPeers aim to fill gaps in a broken mental health care system by offering users the opportunity to talk to someone online. Like Lippin, some founders find their mission complicated by bad actors with other ideas about how to use the platform. 

A months-long Mashable investigation into these emotional support platforms, including the popular free service 7 Cups, found that users may be exposed to moderate or significant risk in their pursuit of consolation and connection. 


This story is part of our investigation into the emotional support platform 7 Cups and the growing marketplace for apps and platforms that pair people with someone who is supposed to be a compassionate listener. The series explores a failed experiment between the state of California and 7 Cups, as well as the myriad risks of seeking emotional support online from strangers. These dangers can include the manipulation of vulnerable youth and targeted abuse and harassment. The series also includes an analysis of why it's so hard to stop online child exploitation, and looks at solutions to make platforms safer.


In one 2018 case, a 42-year-old man posed as a 15-year-old on 7 Cups to access the platform's teen community. He manipulated a 14-year-old girl into creating child sex abuse material and was ultimately charged and jailed for the crimes. That same year, 7 Cups won a contract to provide its services to residents of certain California counties, but its contract was cut short in 2019 after safety concerns emerged, among other issues.

In general, risks on emotional support platforms include encountering an anonymous stranger who's well-meaning but ultimately hurtful, or a purposefully cruel bad actor who, for example, tells someone hoping to feel less alone to kill herself. 

While these issues were most egregious on 7 Cups, Mashable also tested other platforms in this market, interviewed some of their members, and spoke with their CEOs, finding that 7 Cups' competitors have faced a range of challenges. These startups are under pressure to develop a successful, scalable business model, all while battling bad actors who find ways to circumvent common safety measures.

It's not unlike what happens every day on the internet, but in this case the victims can be emotionally or psychologically vulnerable people who opened up to a stranger believing they were safe.   

Unlike in formal mental health treatment, there is currently little recourse for those who've been seriously harmed by their conversations on an emotional support platform. The field is largely unregulated, and federal law has traditionally immunized online platforms from liability in many instances when their users are harmed.

Meanwhile, if someone seeks compassion on an emotional support platform but finds predation and abuse instead, the experience can do lasting damage.

"I think you have very real risk that somebody would view this as part of being the quote-unquote mental health system, and if they had a bad experience, I can imagine them never engaging in mental health again, or never seeking other types of treatment or support again," said Dr. Matt Mishkind, a researcher who studies technological innovation in behavioral health as deputy director of the University of Colorado's Helen and Arthur E. Johnson Depression Center.  

What is peer support? 

These companies often use the term peer support to describe their services. Most people who hear this probably imagine a reputable in-person or virtual group run by a mental health provider or organization. 

The National Alliance on Mental Illness' peer-to-peer program, for example, brings people coping with mental illness, or their families, together under the supervision of a trained facilitator. Research indicates that these programs may help with recovery.

Less familiar are peer support specialists, a growing workforce of trained individuals who draw on their own lived experience with mental illness or substance use to aid someone in recovery, in a clinical or outpatient setting. 

This type of intervention shows promise in clinical research for people with mental health conditions. Some studies note small to modest improvements in symptom remission and quality of life. Last year, Blue Cross and Blue Shield of Minnesota announced that access to peer support specialists would be a covered benefit for certain members beginning in 2024.

Peer support specialists, however, do not staff all emotional support platforms. HeyPeers does allow certified peer support specialists to offer their services for a fee, and HearMe users may engage with them as well. 

This distinction between peer-to-peer support and peer services led by trained individuals who adhere to standardized peer-practice guidelines is important. Someone who downloads an app marketed as offering peer support may not, in fact, talk to a trained peer professional.

How does peer support work? 

When a person does have a positive experience on an emotional support platform, it can be life changing. 

Mashable interviewed two participants in Circles' facilitated support groups, who said their weekly interactions with other members helped them feel less alone and more prepared to handle emotional challenges.

That service is separate from Circles' free offering, which allows users to gather in hosted chat rooms, discuss topics like parenting, self-care, and workplace stress, and anonymously direct message each other. 

Once someone has received help on an emotional support platform, they may derive great satisfaction from extending similar compassion to someone else, in a listener role, according to people who've used different platforms and spoke with Mashable about their experiences.

Still, there is no high-quality research demonstrating that digital emotional support platforms are as effective as peer support specialists or even computer-based cognitive behavioral therapy treatments.

Some of the past studies on 7 Cups weren't rigorous or large enough to draw any conclusions. Four studies conducted between 2015 and 2018 were largely focused on testing the platform rather than establishing high-quality clinical claims of efficacy. Some of the studies had fewer than 20 participants. Regardless, the company continues to advertise its platform as "research-backed" and "evidence-based," a claim its founder and CEO Glen Moriarty defended to Mashable. He noted that the platform's "self-help guides" and "growth paths" are based on types of therapy shown to be effective, including cognitive behavioral therapy and dialectical behavior therapy.

Other companies have published their own research. 

Last year, Wisdo Health published a study in JMIR Research, which found that users experienced decreased loneliness and depression symptoms, among other improvements, after using the platform. The authors also noted that a randomized controlled trial that compared "peer support" to interventions like cognitive behavioral therapy "would be a valuable contribution to the literature."

"It's an exciting moment to be working in this space because it's graduating to a depth of conversation which I'm not sure that peer support has enjoyed in the past," Wisdo Health founder and CEO Boaz Gaon told Mashable in an interview last year. The company, which was founded in 2018 and claims to have 500,000 users, offers clinical referral services to users who are identified as potentially benefiting from therapy. 

Ryan K. McBain, a policy researcher at the RAND Corporation who has examined the efficacy of peer support specialists in the mental health system, told Mashable in an email that peers seem to be most effective when they meet a minimum set of criteria, receive standardized training, have supportive supervision, and are well-integrated into the overall health system. Emotional support platforms often lack these safeguards and provide minimal training.  

McBain said he doubted that untrained individuals would have the "full set of tools" required to support a client, or user, in the same manner as someone who underwent full peer support specialist certification. While he sees value in empathetic listening, particularly from those with lived mental health experience, he believes emotional support platforms need to be fully transparent about what they are — and what they're not. 

"I am not discounting the possibility that these platforms may prove to be a disruptive innovation over the long-run — but they require regulation, and the government is in a position of playing catch-up," McBain said. 

When talking to someone for free on the internet became a big business 

Though it took time, the isolation of the COVID-19 pandemic, as well as the loneliness epidemic, supercharged the concept of digital peer support as a business proposition.  

Wisdo Health has raised more than $15 million from investors like 23andMe founder Anne Wojcicki and Marius Nacht, an entrepreneur who cofounded the healthtech investment fund aMoon. 


The company describes itself as a "social health" platform, emphasizing that it measures changes in people's perception of their loneliness, among other emotional indicators. Users can access the platform for free, but the majority are sponsored by an employer. 

Circles, a competitor to Wisdo Health, has raised $27 million since its founding. 

Other companies have raised far less money. STIGMA, which folded at the end of 2023, was initially bootstrapped by its founder and CEO Ariana Vargas, a documentary filmmaker. HearMe has raised approximately $2 million. It partners with third parties and offers two subscription tiers: a weekly membership costs $7.99, while an annual membership costs $69.99.

HeyPeers generates most of its revenue by hosting and staffing video-based support groups for nonprofits. Independent members can join for free. They can participate in HeyPeers support groups, which are facilitated by certified peer support specialists, for $10 per meeting.  

Both Circles and Wisdo Health have pivoted away from a subscription strategy, focusing on landing contracts with major payers like insurers and employers. In March 2023, Wisdo Health partnered with a nonprofit organization in Colorado to make the platform available to adult residents, with a particular emphasis on reaching Medicaid recipients.

In 2018, 7 Cups received a multimillion-dollar contract from the California Mental Health Services Authority to provide the platform to residents in certain counties, but that project was quietly terminated after safety issues, including abusive and sexually explicit behavior, became a concern, according to sources involved in the initiative who spoke to Mashable.

Balancing growth and safety

Rob Morris, CEO of the youth emotional support platform Koko, incorporated it as a nonprofit in 2020, after originally cofounding it as a for-profit company. The shift was motivated partly by Morris' decision not to sell user data or sell the platform to third parties, like employers or universities.

"I think it's hard to find a business model in this space, particularly if you're reaching underserved individuals or young people, that doesn't create misaligned incentives," he said. "We just couldn't find a business model that made sense ethically for us." 

He noted that platforms under pressure to demonstrate high engagement may hesitate to create robust safeguards. 

"The more moderation you put in place, the more constraints you put in place, the less user engagement or attention you get," he said. 

Recruiting users for emotional support platforms often requires a low barrier to entry, like free access to services and anonymity. At the same time, these features can create risky or dangerous conditions on the platform.

Companies may also find ways to derive additional value from the users themselves. Lippin, CEO of HearMe, told Mashable that one of its business deals involves providing its listening service to nurses at a time when burnout is causing a shortage in the profession. 

HearMe aggregates and anonymizes what the nurses share and relays that to their employer, which wants to identify workplace concerns or complaints that might affect their well-being. Lippin said the terms of service indicated to consumers that their data could be used in this way. 

STIGMA, a platform designed for users to receive support when talking about their mental health, tested sponsored content prior to shutting down at the end of 2023. Vargas, the company's founder and CEO, told Mashable that she didn't want to advertise to users, but instead hoped to present users with content "sponsored by the people who want their brands in front of our member base." The founders of Wisdo Health and Circles both told Mashable that they are opposed to advertising.

Many emotional support platforms rely, in some way, on the free labor of volunteer listeners. 7 Cups has been uniquely reliant on volunteer labor to perform critical tasks since its founding.

"We deliberately designed the platform with a volunteer emphasis from the very beginning, because that appears to be one of the only ways to scale emotional support," Moriarty told Mashable.  

On Wisdo Health, a user can become a "helper," an unpaid community leadership role, after graduating from a training program made available to highly engaged and helpful users. They receive a helper badge only if they pass a training test and continue to demonstrate high levels of helpfulness to others, as assessed by the platform's algorithm. Helpers are expected to check on a certain number of users each day. Roles above helper are filled by paid staff members. 

HearMe uses a combination of paid and volunteer listeners, including graduate students pursuing a social work degree who need the experience to meet their program's requirements. The company vets graduate students, psychology interns, and certified peer specialists against a list of "excluded" individuals maintained by the Office of the Inspector General at the Department of Health and Human Services. The list comprises individuals who violated certain laws, including by committing patient abuse and health care fraud. 

The amount of training volunteer listeners receive varies widely. 7 Cups requires users to complete an "active listening" course in order to become a listener who takes chats. It also hosts numerous other trainings, but they are optional. Circles members who want to become a "guide" and host their own chat room must apply and, once accepted, receive facilitator training. 

In general, volunteer support and listening is often sold to consumers as a fulfilling way to give back, perhaps not unlike how one might volunteer for a crisis line. Those organizations, however, are typically nonprofits, not startups with the backing of venture capital and an eye toward potentially being acquired.  

Safety challenges on emotional support platforms

Founders of emotional support platforms often share a compelling personal story about why their product is critical at a time when loneliness is surging and mental health is declining. 

Gaon has said that his father's battle with terminal cancer led to the platform's creation. Irad Eichler, founder and CEO of Circles, said that his mother's experience with cancer, and the support he received from friends, prompted him to build a "place for people dealing with any kind of emotional challenge." 

For consumers, the assumption undergirding the concept of an emotional support platform is that people will use access to such a network for good. The reality, however, is far more complicated. 

Eichler is candid about the fact that some people occasionally join the platform with "different motivations, and not with the best intentions," even if the vast majority of interactions are positive or supportive.  

That's why both members and paid staff moderate rooms to make sure discussions stay on topic and conversation remains respectful. Eventually, said Eichler, artificial intelligence will constantly police all the rooms and alert the company to bad behavior. Moriarty, of 7 Cups, told Mashable the company was working on deploying a similar solution, including for one-on-one chats.

Users on both platforms can manually report negative experiences.  

Offenses met with an immediate ban on Circles include violent or inappropriate language, aggressive behavior toward others, noncooperation with group facilitators, and taking over a chatroom despite the protests of other users.

"It's an ongoing challenge," Eichler said of the risk bad actors present to emotional support platforms. "It's not something that you can solve. There's a tension that you will always need to manage. I don't think we will hit the place where Circles will be a 100-percent safe space." 

Eichler was emphatic that safety was a priority, as were the CEOs of Wisdo Health, HeyPeers, HearMe, and 7 Cups. 

Yet each major emotional support platform also employs anonymity, which can create unique risks.  

On 7 Cups, bad actors and predators have taken advantage of anonymity. Abusive behavior on the platform has included sexual and violent language, including directing users to kill themselves, according to former and current staff and volunteers who spoke to Mashable. 

On HeyPeers, which allows teens to join, CEO Vincent Caimano told Mashable that, last year, the platform's staff caught a man appearing to flirt with a teen girl in a chatroom about depression. The room, which had been unmoderated overnight, was open for conversation among anonymous users. When the public exchanges were noticed in the morning, Caimano banned the adult user and staff reached out to the teen about the incident. The company has also shut down chat rooms whose hosts didn't moderate them consistently enough, which means checking in every day and participating in conversation. In general, HeyPeers conducts background checks on its staff and contractors via the service Checkr.

Gaon defended Wisdo Health's use of anonymity. He told Mashable that the company had encountered past situations in which people didn't feel comfortable sharing information with a listener if it could be traced back to them, and that he wanted the platform to cater to both those who want to publicly identify themselves and those who don't.

"If you don't allow anonymity, you're not giving the user control over how open they want to be with their real name and real profile details," he said. Gaon later added that the vast majority of the platform's users join via a sponsor, like an employer, that requires them to verify their membership and identity to join. The remaining users have joined without that level of vetting. 

Koko enforces anonymity, and it does not allow users to message each other directly, even though they routinely ask for the feature, Morris said.

"If we let people continue chatting and DMing with each other, retention and engagement would shoot up a ton, but it's just not what our aim is," he said. "The risk of these longer conversations, people being paired up, is just one we've never taken on."

Dr. Mishkind, a proponent of both high-quality peer support and technological innovation in mental health care, said that he would be hesitant to use any emotional support platform knowing that encounters could end in abuse, harassment, or predation.

"It's a huge risk to everybody associated with it," he said.  

Why consumers aren't protected from harm  

Despite the reality that consumers have painful or harmful experiences on emotional support platforms, the companies may bear no responsibility when this happens.  

A federal law known as Section 230 of the Communications Decency Act has long shielded online platforms from liability when their customers treat each other poorly. Notable exceptions include copyright law, illegal activity, sex trafficking, and child abuse that the company knew about and didn't attempt to stop.

While Congress has raised the prospect of overhauling Section 230, particularly to improve child safety online, digital platforms can continue to invoke it as a defense against liability. 

At Mashable's request, Ari Ezra Waldman, a professor of law at the University of California, Irvine, reviewed the terms of service for the companies Mashable reported on and found very limited grounds for a lawsuit if a user sought recourse after experiencing harm. 

Waldman noted that this is a common reality of the "platform economy." 

He added that the business model of connecting people to strangers for "quasi mental health support" would be less likely to exist in a "world where platforms were more accountable to their users, and to the bad things that happened to their users." 

The Food and Drug Administration and Federal Trade Commission also do not have a clear or obvious role in regulating or enforcing actions against emotional support platforms. 

Attorney Carrie Goldberg believes accountability may be on the horizon. Last year, she sued the chat platform Omegle on behalf of a teenage girl who'd endured years of horrific digital abuse after being paired with a child predator. 

The case moved forward despite Omegle's efforts to shield itself from liability by citing Section 230. The judge found that Omegle could be held responsible for defective and negligent product design. Omegle settled the case, then shut down.

"[T]here's not a culture where investors or founders are necessarily looking at the ways that a product can be abused, because they're going in arrogantly thinking that they're going to be immune from all harms that happen," Goldberg told Mashable. 

When 7 Cups lost its government contract in California, it led to a settlement agreement that prohibited either party from disclosing its existence and terms, except under specific circumstances, like complying with the law. It's unclear whether the same thing could play out in the future with other emotional support platforms that partner with government agencies, should critical issues arise and lead to a terminated contract.

Mishkind said that companies offering a digital solution to mental health care access should be considered part of the system, and treated as such with clear regulation and rigorous independent evaluation, rather than as outsiders not subject to the same rules as other medical entities.

"I don't think we've quite wrapped our arms around that yet," Mishkind said. "There's this kind of protection around them because they are being seen as disruptors, but…we're all now part of the same system."

If you are a child being sexually exploited online, if you know a child who is being sexually exploited online, or if you witnessed exploitation of a child occur online, you can report it to the CyberTipline, which is operated by the National Center for Missing & Exploited Children.

Rebecca Ruiz

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Prior to Mashable, Rebecca was a staff writer, reporter, and editor at NBC News Digital, special reports project director at The American Prospect, and staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a Master's in Journalism from U.C. Berkeley. In her free time, she enjoys playing soccer, watching movie trailers, traveling to places where she can't get cell service, and hiking with her border collie.


More from Bad Connections
California paid millions to access a mental health app. It wasn't safe for users.
Why online child exploitation is so hard to fight
Teens who talk about their mental health on this app may be taking a big risk
Emotional support platform 7 Cups beset by trolls
