Writers: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Artists: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Developers: Wheeeeeeeeee!
-
@jamesthomson I stopped supporting ATP because of their change from “AI is theft” to “you should pay $20 per month for ChatGPT.”
-
@jamesthomson how dare you
-
@jamesthomson "risk devaluing our entire profession..." To the people developing these models, this isn't a bug, it's intended behavior.
-
@jamesthomson DragThing now by ChatGPT.
-
@xenonchromatic Indeed.
-
I know a manager who suggested someone use AI to write a simple description of a piece of code.
Nuts.
-
@amyinorbit I mean, aside from the personal, ethical, societal, financial, and environmental issues, it's just great.
-
@jamesthomson I think the problem is developers don't really consider any unattributed use of open source as stealing - just a mild grey area. (They should consider it stealing.)
-
@the_other_jon @jamesthomson John Siracusa seems to offer the most balanced perspective there, but yeah it's pretty grim. I mostly don't listen any more. And don't get me started on MacStories omfg

-
@jamesthomson Also developers: Vibe-coding generative AI models were built on our stolen source code, we're all being laid off, please hire us.
-
@jamesthomson
I would be interested in training a model based on my own code. I spend a decent amount of time looking through my own code to find something I know I've done before.
-
@colincornaby @jamesthomson Open source != public domain, and free software != free (it's free speech, not free beer), but apparently many developers are clueless re: all those nuances.
Perhaps if all LLM-generated code were automatically placed in the public domain by law, we'd see a bit of a light-bulb moment.

-
@jamesthomson I’ve been struggling with this cognitive dissonance for years.
-
@estranged My understanding is that there isn't enough data in a small set like that to actually train usefully, as a standalone thing. So it's always going to be based on other models, with your training on top.
-