basiclemmon98@lemmy.dbzer0.com to Not The Onion@lemmy.world · English · 2 months ago
After using ChatGPT, man swaps his salt for sodium bromide—and suffers psychosis (arstechnica.com)
48 comments · cross-posted to: technology@beehaw.org, worldnews@aussie.zone, fuck_ai@lemmy.world, publichealth@mander.xyz
prole@lemmy.blahaj.zone · English · 2 months ago
Yeah, people like to mock us about it, but I think it’s a reasonable regulation.
Tuukka R@sopuli.xyz · English · 2 months ago
> Yeah, people like to mock us about it, but I think it’s a reasonable regulation.

It should apparently be amended, though. There is a known case that it accidentally forbids but should not forbid.
Tuukka R@sopuli.xyz · English · 2 months ago
deleted by creator