RE: https://c.im/@cdarwin/116327241614183410
I take substantial issue with this.
"Psychosis" is something humans develop.
LLMs cannot develop "psychosis" in any way.
Nor are they capable of "hallucinating."
That is improper anthropomorphisation.
The reality is that LLMs generate inaccuracies, synthesise incorrect data, and so on, but these are not "hallucinations."
People hallucinate. Machines do not.