Nothing "went rogue".
-
RE: https://c.im/@cdarwin/116479704797697865
Nothing "went rogue". AI didn't delete the firm's database and backups. A human operator built admin automation and ran it in production without adequate testing or backups.
I'm sorry, but no: a human gave admin privileges to unverified tools and ran them in a production environment.
Own your work. You as sysadmin, developer, etc. are paid to perform a job with skill and diligence. Ultimately you are responsible for your professional work. If there was someone upstream responsible for V&V of the tool, ensuring users are trained, cautions and limitations of the tool are communicated, and confirming the tool is fit for its intended use, they bear a share of that responsibility.
If you're the manager who forced a worker to use an unreliable tool on production systems without putting it through proper V&V, without effective user training, use case development, or risk assessment, you bear a share of the responsibility.
Repeating this for those in the back: AI does not launder away your job responsibilities.
-
@arclight I immediately assumed this was an excuse for an intelligence operation
-
100%
-
@arclight “own your work”
Preach! Preach this far and wide! Be proud of the actual things you create and be responsible for them. It'll make your work better and you a better human being.
-
@arclight According to the article, they gave the tool an access token with full access, used the tool on staging, and it had side effects on prod.
The CEO seems to say it wasn't possible to create a properly scoped access token, which seems weird and a big red flag if true.
-
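The scoping point in that reply is worth making concrete. A minimal sketch of what an environment-scoped credential check might look like, assuming the tool receives a token bound to a single environment rather than org-wide access (all names here — `Token`, `run_migration` — are hypothetical illustrations, not any real vendor's API):

```python
# Hypothetical sketch of least-privilege token scoping: a credential bound to
# one environment, and an operation that refuses to act outside that scope.
from dataclasses import dataclass


@dataclass(frozen=True)
class Token:
    environment: str   # e.g. "staging" or "production"
    can_write: bool


def run_migration(token: Token, target_env: str) -> str:
    """Refuse any operation whose target doesn't match the token's scope."""
    if token.environment != target_env:
        raise PermissionError(
            f"token scoped to {token.environment!r}, refused on {target_env!r}"
        )
    if not token.can_write:
        raise PermissionError("read-only token")
    return f"migration applied to {target_env}"


staging_token = Token(environment="staging", can_write=True)
print(run_migration(staging_token, "staging"))  # in scope: allowed
try:
    # the cross-environment side effect the article describes
    run_migration(staging_token, "production")
except PermissionError as exc:
    print("blocked:", exc)
```

With a scope check like this in the credential layer, a tool holding a staging token simply cannot touch prod, which is why "it wasn't possible to scope the token" reads as a red flag.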
@arclight shhhhh with your reasonable takes, the more they blame the LLM the less trustworthy all LLMs seem

-
@NewtonMark @arcadiagt5 @cdarwin "AI ate our runway"
-
@ajn142 Note the conspicuous absence of the "I made this (using a commercial chatbot reassembling other people's unattributed code)!" boosters stepping up to own this bad outcome. You can't have it both ways.
Chatbots are amazing and wonderful and effective, except when they aren't; then suddenly it's "well, you scoped it wrong, didn't specify the right guardrails, didn't use the right please-do-not-incinerate-prod phrasing." It's all the typical alt-med faith-healer victim-blaming BS for when someone dies of cancer after following their mystical juice-cleanse nonsense: you failed the System, the System didn't fail you. Except now the grifters are techbros who reject any notion of responsibility in addition to rejecting consent. Not even victim-blaming, just "LOL bro, we have your money -- sucks to be you!"
-