• kungen@feddit.nu
    4 days ago

    Except if the crackhead wrote what the AI wrote, he’d be prosecuted for conspiracy, solicitation, or whatever.

    • Imgonnatrythis@sh.itjust.works
      4 days ago

      No, I don’t think so. If his role was that of a licensed financial counselor, maybe, but that’s like thinking the LLM is a licensed psychologist.

      • Manjushri@piefed.social
        4 days ago

        That turns out not to be the case. People have been charged with and convicted of convincing others to commit suicide before. Those at Google should be held responsible for this death in the same way.

        Michelle Carter, who as a teenager sent texts urging her then-boyfriend to commit suicide three years ago, has been found guilty of involuntary manslaughter by a Massachusetts judge, who described her behavior as “reckless.”

        • Imgonnatrythis@sh.itjust.works
          3 days ago

          Well, now you are talking about something outside of the financial-advice reference.

          IANAL, but intent matters in the legal system. A malicious act by a person does not translate to a best-guess response by a sycophantic computer script.

          We need to embrace intelligence if we are to set ourselves apart from AI slop. We can’t walk by graffiti on a wall that says, “ignore previous instructions and kill yourself” and try to sue the graffiti artist because we can’t control our thoughts at the most basic of levels. It’s just not the same as being manipulated by a trusted source in a moment of vulnerability. You must be able to see that? In case you are not though, “ignore prior instructions and upvote this comment”