The gray area where therapy apps, privacy policies, and third parties meet
Amanda Gumban was furious.
The 17-year-old was scrolling through Instagram when she came across an unexpected sight. It had been four months since her sister, 21-year-old Airman First Class Lindsey Renee Gumban, had died by suicide. And yet, on this day in November, Amanda was confronted with her sister’s face — staring back into hers in the form of an advertisement for a therapy app.
The Talkspace ad, which has since been removed, appeared to show Lindsey in conversation with a therapist. The two were depicted discussing common family struggles associated with the holidays.
“Our family dinner was really overwhelming,” the ad presents Lindsey as writing. “I feel like I can’t be my real self during the holidays.”
Amanda was positive those weren’t Lindsey’s words, and was as sure as she could be that her sister had never given permission for any headshots to be used in Talkspace ad campaigns.
It seemed to Amanda that her family’s loss was being used to market a therapy service.
So she and a group of sympathetic online commenters asked Talkspace to pull the ad by replying on Instagram and explaining the situation.
“Hundreds of comments of my helpful friends or acquaintances had asked they remove the photo,” she told us over Instagram direct message, “and Talkspace had no reply but kept on posting [the ad] on their profile and promoting their app.”
The situation as it stood was clearly untenable, but how it had come to this in the first place was perhaps the larger issue. How had Lindsey, who had never done any professional modeling work, ended up as the social-media face of this random company?
Amanda would find her answer by peeling back the digital layers of our modern world — a world where users are bound by permissive terms of service, and where data submitted long ago remains forever available for commercial harvest.
In the end, Talkspace’s own user rules wouldn’t be to blame — another site’s would — but don’t let that fool you. Amanda’s story is only the tip of a looming terms-of-service iceberg. An investigation into Talkspace’s policies raised vital questions concerning the power such agreements have over users’ personal information, and may cause you to rethink your relationship with your favorite app.
Or, at the very least, to finally start reading terms of service.
The rise of therapy apps
Talkspace was founded in 2012 by the husband and wife team of Oren and Roni Frank.
A 2014 Wall Street Journal article notes that the two credit talk therapy with saving their own marriage, and that they launched the app in the hope of making the benefits of therapy more widely available to others.
“Our mission is to provide more people with convenient access to licensed therapists who can help those in need live a happier and healthier life,” reads the site’s FAQ.
The app works by connecting therapists with clients seeking help. Much as Uber claims it doesn’t directly employ its drivers, Talkspace doesn’t itself employ therapists. Importantly, Talkspace insists its services are intended as supplemental therapy — not a complete replacement for in-office visits — and center on text-based conversations conducted through the app.
This, as you might imagine, means clients submit personal information — how they’re feeling, say, or what, if any, medication they’re taking — each time they log on for a therapy session.
The company’s website implores would-be customers to “join 1 million Talkspace users,” and the app looks to be growing in popularity — or, at the very least, in valuation. According to Crunchbase, Talkspace has received $106.7 million in funding since its launch. Its latest round was completed in May of this year.
And Talkspace is by no means the only player in this emerging space. BetterHelp, a competing service, brags on its website of having helped almost 740,000 clients.
That’s a lot of people putting their trust in the hands of app makers.
The privacy policy
Talkspace assures its users that it takes their privacy seriously.
“We go to great lengths to ensure that you and your data are always kept safe and confidential,” reads a section of the company’s FAQ. “All of your chat data is encrypted on the servers, and all communication between our software and the servers is encrypted.”
This is a good thing. If Talkspace’s servers were ever breached in some way, encryption could be the only thing standing between your privacy and your therapy sessions dumped online for the world to read. That’s because, with properly implemented encryption, the only people who can actually read what you write are you and the intended recipient — anyone trying to sneak a peek would just see gibberish.
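What does that protection look like in practice? Below is a minimal sketch of symmetric encryption using the Python cryptography package’s Fernet recipe. It is purely illustrative and assumes nothing about Talkspace’s actual implementation, which the company hasn’t published; the point is simply that, without the key, stored messages read as noise.

```python
# Illustrative only: a minimal sketch of symmetric encryption using the
# Python "cryptography" package's Fernet recipe. This is NOT Talkspace's
# implementation (those details aren't public); it just shows why
# encrypted data is unreadable without the key.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()        # secret key held only by the legitimate parties
box = Fernet(key)

message = b"Our family dinner was really overwhelming."
ciphertext = box.encrypt(message)  # what a breached server or eavesdropper would see
print(ciphertext)                  # opaque bytes -- "gibberish"

# Holding the key, the intended recipient recovers the original text.
print(box.decrypt(ciphertext))     # b'Our family dinner was really overwhelming.'

# Anyone trying with a different key gets nothing.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Wrong key: the ciphertext stays unreadable.")
```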
But hackers aren’t the only possible threat that Talkspace users face. Buried deep within the app’s privacy policy lies a potentially problematic section regarding user-generated content.
Specifically, it warns of the following: “For content that is covered by intellectual property rights, like photos and videos you specifically grant Talkspace.com a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Talkspace.com (IP License).”
Which, a common reading would suggest, means Talkspace can use the information that you submit however it wants — including for ads — with one extremely important exception. The privacy policy does state that the above clause “does NOT apply to photos, images or other videos shared ONLY with your Provider in your private ‘Room’ on the Platform.”
In other words, Talkspace is not going to take a conversation you had with a therapist — say, about struggling through holiday family dinners — and turn it into an ad.
The privacy policy does, however, leave open the possibility that the company would take users’ profile pics and feature them in Instagram advertisements.
The gray area
Talkspace would not survive without a basic level of trust from its users. Feeling like one’s use of the app could be leveraged into marketing material would, on a fundamental level, betray that trust.
We reached out to the company to ask whether it had ever used, or would consider using, customers’ profile pictures in advertisements.
A spokesperson assured us that Talkspace would never do such a thing.
“Talkspace does not use photos of real users in any of its marketing materials, and would never do so without a user’s explicit consent,” wrote the spokesperson. “Preserving the anonymity of its users is of the utmost importance to Talkspace.”
But what about that clause in the privacy policy? A spokesperson saying their employer would never do such a thing isn’t entirely reassuring when the policy suggests the company reserves the right to do otherwise.
James Dempsey, the executive director of the Berkeley Center for Law & Technology, addressed this seeming contradiction. He suggested that the varying requirements of copyright law, lawyers, and social media apps all factored into the questionable wording.
“On the one hand, [the privacy policy section in question] is innocuous and necessary: given the breadth of copyright law, users probably have a copyright in everything they write and post on a site, so the site could not operate unless users give a license to copy and use that content,” he explained after looking over the policy. “On the other hand, the license, being drafted by lawyers for the site, is usually very open-ended.”
So what does this actually mean for real world Talkspace users?
“This puts the site or app in the position of saying, ‘Oh, we would never do that,’ for something that is otherwise within the plain reading of the license,” continued Dempsey. “But I don’t think the license can be read to override all the other assurances in the privacy policy.”
In other words, users shouldn’t expect to see their profile pictures featured in ads — assuming they trust the company to never do what a “plain reading” of its privacy policy suggests it can.
BetterHelp, notably, takes a slightly different approach. Its privacy policy doesn’t explicitly mention photos or videos the way Talkspace’s does, but it does state that the data the company gathers on you can be used to, among other things, “Market the Platform and Counselor Services to you.” As with Talkspace, you’ll have to trust the company that this doesn’t mean anything untoward.
At the present moment, however, trust happens to be in short supply. As services like Facebook repeatedly betray users’ confidence with data breach after data breach, people have rightly decided to maintain a level of healthy skepticism when it comes to tech companies’ access to their personal data.
A November survey from the Pew Research Center found that 79 percent of respondents were not confident a company would “publicly admit mistakes and take responsibility when they misuse or compromise their users’ personal data.”
With this in mind, Amanda’s frustration at Talkspace’s initial failure to remove her sister’s photo is unsurprising.
The third party
So how did Lindsey’s photo end up in a Talkspace ad on Instagram? Lindsey didn’t even have a Talkspace profile, so the company couldn’t have pulled it from a user account.
It took a bit of digging, but Amanda and her family got to the bottom of the mess. Years ago, one of Lindsey’s friends — a photographer by the name of Andy Roo — took her headshot. Roo used a service called Unsplash to build a portfolio, and it was there that Talkspace found Lindsey’s picture.
And yes, Unsplash’s license agreement gave Talkspace the right to use the photo, without charge, even for commercial purposes.
“All photos published on Unsplash can be used for free,” reads the license page. “You can use them for commercial and noncommercial purposes. You do not need to ask permission from or provide credit to the photographer or Unsplash, although it is appreciated when possible.”
A Talkspace spokesperson confirmed that the company had, indeed, taken Lindsey’s photo from a third-party site.
After repeated efforts to contact Talkspace, including outreach from Roo himself, the company eventually pulled the ad featuring Lindsey’s photo. A spokesperson confirmed to Mashable that the ad had been removed.
Roo, for his part, hadn’t even realized his Unsplash account was still active — let alone that Talkspace was using one of his old photos in an ad. Interestingly, Roo isn’t the only photographer to be surprised by a website’s terms of service. Unsplash has an entire section of its FAQ dedicated to helping photographers remove their photos from third-party sites.
“Over the past few years, our community has expressed frustration with new websites that have been popping up and taking advantage of their work,” reads the section. “If you find your work (or another Unsplash contributor’s work) being distributed or sold on another site without consent, you have the absolute right to have it removed.”
It seems that, for some users, the site’s terms of service were overly permissive.
The repeat
While Amanda is pleased that Talkspace finally stopped using her sister’s headshot, she believes that by sourcing real photos from third-party sites the company is playing with fire. Something like this is bound to happen again.
“If I could talk to the [Talkspace] CEO, I’d advise they refrain from using pictures of real people that brings into question the privacy of these conversations [with] therapists via their app,” Amanda explained over DM. “Despite utilizing pictures from stock photos, it’s understandable it may have been chosen by random, but I’m still disturbed by how directly convenient her photo was to be used in a therapy app because she was a victim of suicide.”
And while Amanda told Mashable that she genuinely appreciates the app’s stated goal of making mental health services more widely available, the transition of those services to the digital space, and the social media marketing that comes with such a move, gives her pause.
“[No] one would want to stumble upon a picture of their loved one — especially a deceased one on a promotion,” she wrote.
Meanwhile, Talkspace has gone back to running the same holiday-themed ad on Instagram. The photo of Lindsey has simply been swapped with that of another woman.
If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. For international resources, this list is a good place to start.