
https://www.linkedin.com/posts/marie-potel-saville_tiktok-knows-exactly-how-much-time-it-takes-share-7450423802775285760-y2Q3

Uncategorized · tiktok · 7 Posts · 3 Posters · 2 Views
paninid@mastodon.world wrote:

    https://www.linkedin.com/posts/marie-potel-saville_tiktok-knows-exactly-how-much-time-it-takes-share-7450423802775285760-y2Q3

    #TikTok knows exactly how much time it takes to get you addicted to their algorithm: 35 minutes.

    According to internal documents revealed in a lawsuit, a user is likely to become addicted after 260 videos.

    At 8 seconds per video, that's ~35 minutes.

    We only know this because of a legal accident. In 2024, 14 US attorneys general sued TikTok for deliberately addicting teenagers.

    (1/5)

paninid@mastodon.world wrote:

      In one of the lawsuits, the redactions were faulty and 30 pages of internal documents became public.

      What they revealed is hard to read.

      TikTok's own research found that “compulsive usage correlates with loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety”.

      They knew, they documented it, and they chose to keep building anyway.

      (2/5)

paninid@mastodon.world wrote:

      Instead of trying to reduce screen time among teenagers, they built time-management tools to improve "public trust in the TikTok platform via media coverage."

      The tobacco industry used the same playbook for 40 years: it also called it "problematic use" and tried to shift responsibility onto consumers.

      But we now have the documents, and the courts are starting to use them.

      (3/5)


paninid@mastodon.world wrote:

          TikTok is not an isolated case: Meta, YouTube, and others use the same logic. They all have the same kind of internal documents, and they make the same choice every day.

          When an entire market is designed to exploit human cognitive weaknesses at scale, it is no longer a market in the economic sense (i.e., optimal allocation of resources and the greatest benefit to consumers); it is a predatory system.

          I now call it "predatory design".

          (4/5)


paninid@mastodon.world wrote:

            Of course, regulation is necessary. But laws alone are not enough, and fines cannot undo a decade of engineered addiction.

            The only answer to a systemic problem is a systemic solution: technology that puts human autonomy back at the center of digital design.

            We need Human Safety Tech to protect all citizens, especially the youngest, who are the most vulnerable.

            (5/5)


manyroads@mstdn.social wrote:

              @paninid This falls in line with a long list of marketing-driven global destruction... tobacco, sugar, fats, gasoline, painkillers, ....


stephaniemoore@mastodon.online wrote:

                @paninid what does “Human safety tech” mean to you?
