This is especially insidious because Talkspace is largely an asynchronous text-based service. (Not just virtually meeting online on a secure telehealth platform to talk.) So *everything* would have been communicated within these texts:
https://www.proofnews.org/womans-talkspace-therapy-app-sessions-exposed-in-court/
@_L1vY_ Yeah that is effed up. The article said that Talkspace is using all the therapy sessions to train their soon to be released AI therapy bot…
-
@lawyersgunsnmoney Yeah. Jesus.

-
"By end of last year, the platform boasted approximately 200 million eligible patients. Their conversations form the basis of Talkspace’s vast mental health database. Speaking at healthcare INVESTMENT conference last yr, Talkspace CEO Jon Cohen said platform had compiled '8 billion words, 140 million messages, 6.2 million assessments.'
The data trains a 'therapy companion' chatbot slated to be released later this yr...the company wants to secure insurance reimbursement for the automated tool."
-
@_L1vY_ I wonder if a professional board (I know there are several types) would award a license to practice to a bot 🧐 Seems like to get reimbursed by insurance the provider has to be licensed…to provide therapy…Is the bot providing therapy? I could shake a Magic 8 Ball, give a patient an answer and then bill insurance. I think the CEO is high on his own supply with that idea and is a douche for expropriating people’s HIPAA data
-
@lawyersgunsnmoney How that would almost certainly go--aside from massive lobbying--would be operating the thing under the license(s) of some particular clinician(s) who would then be legally responsible for the automaton's action. And certainly the company would try to avoid responsibility!
-
@_L1vY_ I’m sure you’re right but I can’t imagine a therapist putting their license on the line and letting an LLM practice. And the professional liability insurers - seems like that would be an exclusion from coverage really fast. One wrongful death case…It’s the same as for other professions as well. For lawyers there is no tolerance for that from the courts from what I’ve seen. But hey there are strong economic incentives to unemploy people so
-
@lawyersgunsnmoney One would think! But people go out of their depth doing supervision all the time. Plus a lot of people in MH are not tech savvy; just enough licensees might be persuaded, or believe they're being cutting edge.
-
@L1vY@mstdn.social
Goes back to a few things I've said.
Don't trust anybody in the psychology/psychiatry professions. Anything you say can and will be used against you. Legal protections can be broken by a judge.
Most mental health issues come down to "doesn't make enough money". If people had enough money to live, most of these stress-based mental health issues would quickly evaporate.
-
@crankylinuxuser I both agree and disagree with you
-
@L1vY@mstdn.social
I had a friend years ago who was a commercial pilot. He went through a tough patch, a divorce. That's the sort of thing therapy really helps with.
Except FAA rules demand to know if you have been to any psychologists/psychiatrists. The FAA weaponizes that against pilots under threat of perjury. (Federal clearance applications ask exactly this, too.)
The only way around that is to find a private mental health practitioner, not use your insurance, lie about your name, and pay cash.
It's still technically perjury, but the point is the FAA can't prove anything.
-
@crankylinuxuser
Not sure how the FAA got into the discussion
BUT. Yes, if you have had a diagnosis, the FAA requires you to attend sessions and also undergo assessments with multiple clinicians, not only report whether you have been to sessions. They do have very stringent requirements regarding who is approved for a pilot's license, including rule-outs for having been prescribed certain classes of medications and specific medications, some of which are pretty common antidepressants.
-
@_L1vY_ well, that's a nightmare!
-