"YOU WILL NEVER BELIEVE WHY HURTING MAKES YOU SAD!"
mrberard@mastodon.acm.org
Posts
-
"YOU WILL NEVER BELIEVE WHY HURTING MAKES YOU SAD!" -
I wish I could recommend this piece more, because it makes a bunch of great points, but the "normal technology" case feels misleading to me.
Again, I am not disagreeing with this point, just with the practical utility of choosing to use the term based on it.
-
That's an interesting example, because my understanding is that hearing voices is more common than people think, and often not accompanied by the symptom cluster that would lead to a psychosis diagnosis.
I think the problem is the underlying model for diagnostic criteria, which was already defective IMO even before AI complicated the picture.
Lexically, a single term blurs the nuances. For a broader, umbrella term, 'AI brainrot' seems more appropriate IMO.
-
Agreed. But it's the subtle influence on users' views I'm referring to, which was a social media problem before it was an AI issue.
Sure, we can categorise this as "delusions", but I don't know that bundling everything as 'psychosis' helps the debate, in that it flattens the nuances between subtle and overt cases.
Ultimately, we're trying to apply a medical model designed before mass media, DSM updates notwithstanding. Not surprising it reaches the limits of its utility.
-
... Cory being perhaps a case in point.
-
AI psychosis is appropriate to the cases that stray into psychosis, for sure.
The point here is that foregrounding these cases glosses over all the more subtle cases of affecting users' perception of reality, and these are far more dangerous, if only by their sheer numbers.
-
I wish I could recommend this piece more, because it makes a bunch of great points, but the "normal technology" case feels misleading to me.