You thought a therapist with a pen and paper was scary?
A New York Times investigation into Talkspace, the teletherapy app that’s snagged $100m+ in investment cash, found it was doing much more than jotting down notes.
Employees apparently read anonymized therapy transcripts to evaluate psychologists. (Talkspace said this happened only when an algorithm flagged a session.)
Surprised? That’s par for the course
One study found that 81% of the top-rated mental health apps shared user data with third parties, yet only a little over half of them disclosed it.
Some of that sharing is about advertising, but there's a bigger concern: if you get tagged as having a mental illness, algorithms could discriminate against you.
Facebook was once accused of using similar demographic data to limit which users see housing ads.
For some apps, therapy is just a side hustle
Take Crisis Text Line, which is pretty open about reading your texts. The company looks for key phrases that signal when someone is in crisis.
It feeds that info to its sister brand, Loris.ai, which sells risk-assessment software that can flag when a customer or an employee needs help.
One more thing you should know
In the FDA's eyes, these apps aren't medical devices.
Most are listed as “wellness” apps, which means they have a lot more freedom to share your deets.
Jezebel found that BetterHelp was alerting Facebook every time you opened the app, even though it pitched itself as "100% private."