mozz@mbin.grits.dev to Technology@beehaw.org · 1 year agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devexternal-linkmessage-square82linkfedilinkarrow-up1217arrow-down10file-text
arrow-up1217arrow-down1external-linkSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz@mbin.grits.dev to Technology@beehaw.org · 1 year agomessage-square82linkfedilinkfile-text
minus-squareIcalasari@fedia.iolinkfedilinkarrow-up1·1 year agoOr they aren’t paid enough to care and rightly figure their boss is a moron
minus-squarePup Biru@aussie.zonelinkfedilinkarrow-up5·edit-21 year agoanyone who works for a company setup to craft prompts like this doesn’t get to use the (invalid) “just following orders” defence
minus-squareIcalasari@fedia.iolinkfedilinkarrow-up4·edit-21 year agoOh I wasn’t saying that I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work as LLM chat bots don’t grasp negatives in their prompts very well)
Or they aren’t paid enough to care and rightly figure their boss is a moron
anyone who works for a company setup to craft prompts like this doesn’t get to use the (invalid) “just following orders” defence
Oh I wasn’t saying that
I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work as LLM chat bots don’t grasp negatives in their prompts very well)