Recently, I mused over what it would take, from my perspective, to significantly change my view that the tech industry's infatuation with non-intelligent "intelligence" is a net-negative for society.
What Would It Take
An exercise to see what it would take to change my mind about AI.
Python By Night (pythonbynight.com)
Below are a few choice quotes from my post. (a sort of TLDR)
1/
The leaders of this technology are categorically unethical and detached from society, and I believe their leadership is taking us into a xenophobic future only fit for technocrats subsisting off of slave labor.
2/
Deceptive designs that profit off of anthropomorphism and dark patterns used to gather private data should be outlawed. (This would have the added benefit of also crippling the predatory ad-tech industry.)
3/
I would need to see a transparent attempt to compensate the "humans in the loop" with salaries commensurate with the tasks they are asked to perform, as well as benefits covering the mental health strain and other risks associated with those tasks.
4/
Explicit regulation should prevent for-profit companies from proliferating their tools into the educational sector without any form of oversight.
5/
And one would hope that the tech companies whose tools facilitate the generation of CSAM would be extremely eager to discourage, prevent, or disallow the creation of said content.
Some companies are not only passive about this; they are actively encouraging it.
6/
...existing in a world where tech companies have access to my content (or any content) and can swallow it up wholesale without explicit consent is utterly demoralizing.
7/
For those who think LLM tools will allow people to be more creative, since they'll have more power and resources at their disposal: just take a look at how the tools are being used now.
Create bland essays. Answer bland emails. Write bland README docs. Produce bland code.
There are no sharp edges.
8/
What about you?
What is the worst thing that could happen to convince you that buying into LLM usage is perhaps not all it's cracked up to be?
What if it deletes all your email? Or your hard drive?
Is it a self-serving metric, like how expensive the privilege of using a tool is? If the companies charged you $500 or $1,000 a month, would you say no?
Are there other factors that might sway you outside of how useful (or useless) it is to you?
You don't have to tell me... It's very unlikely that you'll change your own mind.
9/
I'm not swayed by an ethics that values an unknown future, no matter the cost. Technologists make poor visionaries. Their prophecy is not one I believe in, nor one that I would ever want fulfilled.
10/end
-
@pythonbynight This was worth the read.
Thank you.
-
@pythonbynight Very much this, I've been referring to our current society as a cyberbland dystopia. I think the blandness of it all is also why there's less resistance to it than I feel there should be, because it doesn't feel offensive enough, or rather, the offensiveness is covered up by just how bland and boring it all is.
-
@pythonbynight they mean, instead of typing "man putting his palm on his face stock picture" into Google they'd type "ChatGPT, please generate a picture of a man putting his palm on his face" so they can add their personal touch to their PowerPoints. See? Creativity!