1. Invent an illness;
2. Put up a couple of preprints about it;
3. See AI companies' web crawlers swallow it hook, line, and sinker;
4. LLMs start telling users that the condition is real. WTF.
5. See published papers cite them, likely from authors using LLMs to write them. WTAF.
-
@albertcardona "you've got RWSD. Right-wing super dumb. You got infected by your AI girlfriend"
-
@albertcardona #ArtificialStupidity: Fake disease edition
-
@albertcardona I wish my own papers were taken up by the literature that easily..
-
@albertcardona @hjhornbeck Lab leak. New kind.
-
@albertcardona I wish my own papers were taken up by the literature that easily..
Your papers don’t cater to the gullible hypochondriacs that seek medical advice from a chatbot.
-
But ... what if some idiot hooks up AI to gear that starts *synthesizing* it....
-
#Bixonimania on Wikipedia + Wikidata:
-