RE: https://c.im/@cdarwin/116327241614183410

I take substantial issue with this. "Psychosis" is something humans develop; LLMs cannot develop "psychosis" in any way. Nor are they capable of "hallucinating." That is improper anthropomorphisation. The reality is that LLMs generate inaccuracies, synthesise incorrect data, etc., but these are not "hallucinations." People hallucinate. Machines do not.