
This is especially insidious because Talkspace is largely an asynchronous text-based service.

Uncategorized · 13 posts · 4 posters
  • _l1vy_@mstdn.social wrote (#1):

    This is especially insidious because Talkspace is largely an asynchronous text-based service. (Not just virtually meeting online on a secure telehealth platform to talk.) So *everything* would have been communicated within these texts:

    https://www.proofnews.org/womans-talkspace-therapy-app-sessions-exposed-in-court/


  • lawyersgunsnmoney@mstdn.social wrote (#2):

      @_L1vY_ Yeah, that is effed up. The article said that Talkspace is using all the therapy sessions to train their soon-to-be-released AI therapy bot…


  • _l1vy_@mstdn.social wrote (#3):

        @lawyersgunsnmoney Yeah. Jesus. 😤


  • _l1vy_@mstdn.social wrote (#4):

      "By the end of last year, the platform boasted approximately 200 million eligible patients. Their conversations form the basis of Talkspace’s vast mental health database. Speaking at a healthcare investment conference last year, Talkspace CEO Jon Cohen said the platform had compiled '8 billion words, 140 million messages, 6.2 million assessments.'

      The data trains a 'therapy companion' chatbot slated to be released later this year… the company wants to secure insurance reimbursement for the automated tool."


  • lawyersgunsnmoney@mstdn.social wrote (#5):

      @_L1vY_ I wonder if a professional board (I know there are several types) would award a license to practice to a bot 🧐 It seems like, to get reimbursed by insurance, the provider has to be licensed… to provide therapy… Is the bot providing therapy? I could shake a Magic 8 Ball, give a patient an answer, and then bill insurance. I think the CEO is high on his own supply with that idea and is a douche for expropriating people’s HIPAA-protected data.


  • _l1vy_@mstdn.social wrote (#6):

      @lawyersgunsnmoney How that would almost certainly go--aside from massive lobbying--is operating the thing under the license(s) of some particular clinician(s), who would then be legally responsible for the automaton's actions. And certainly the company would try to avoid responsibility!


  • lawyersgunsnmoney@mstdn.social wrote (#7):

      @_L1vY_ I’m sure you’re right, but I can’t imagine a therapist putting their license on the line and letting an LLM practice. And the professional liability insurers - it seems like that would become an exclusion from coverage really fast. One wrongful death case… It’s the same for other professions as well. For lawyers there is no tolerance for that from the courts, from what I’ve seen. But hey, there are strong economic incentives to unemploy people, so 🤷‍♂️


  • _l1vy_@mstdn.social wrote (#8):

      @lawyersgunsnmoney One would think! But people go out of their depth doing supervision all the time. Plus, a lot of people in mental health are not tech-savvy; just enough licensees might be persuaded, or believe they're being cutting-edge.


  • crankylinuxuser@infosec.exchange wrote (#9):

      @L1vY@mstdn.social

      This goes back to a few things I've said.

      1. Don't trust anybody in the psychology/psychiatry professions. Anything you say can and will be used against you. Legal protections can be overridden by a judge.

      2. Most mental health issues boil down to "doesn't make enough money". If people had enough money to live, most of these stress-based mental health issues would quickly evaporate.


  • _l1vy_@mstdn.social wrote (#10):

      @crankylinuxuser I both agree and disagree with you.


  • crankylinuxuser@infosec.exchange wrote (#11):

      @L1vY@mstdn.social

      I had a friend years ago who was a commercial pilot. He went through a tough patch: a divorce. That's the sort of thing therapy really helps with.

      Except FAA rules demand to know whether you have been to any psychologists/psychiatrists. The FAA weaponizes that against pilots under threat of perjury. (Federal clearance applications ask exactly this as well.)

      The only way around it is to find a private mental health practitioner, not use your insurance, lie about your name, and pay cash.

      It's still technically perjury, but the point is the FAA can't prove anything.


  • _l1vy_@mstdn.social wrote (#12):

      @crankylinuxuser
      Not sure how the FAA got into the discussion 😆 BUT. Yes, if you have had a diagnosis, the FAA requires you to attend sessions and undergo assessments with multiple clinicians, not only to report whether you have been to sessions. They have very stringent requirements for who is approved for a pilot's license, including rule-outs for having been prescribed certain classes of medications and specific medications, some of which are pretty common antidepressants.


  • wronglang@bayes.club wrote (#13):

      @_L1vY_ Well, that's a nightmare!
