• einlander@lemmy.world · 8 days ago

    The problem I see with poisoning the data is that AIs being trained for law enforcement could hallucinate false facts that get used to arrest and convict people.

    • patatahooligan@lemmy.world · 8 days ago

      Law enforcement AI is a terrible idea and it doesn’t matter whether you feed it “false facts” or not. There’s enough bias in law enforcement that the data is essentially always poisoned.

    • limonfiesta@lemmy.world · 7 days ago

      They aren’t poisoning the data with disinformation.

      They’re poisoning it with accurate, but irrelevant information.

      For example, if a bot is crawling sites related to computer programming or weather, this tool might lure the crawler into pages about animal facts or human biology.
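
      A toy sketch of that kind of lure, assuming a small Flask server and a simple user-agent check (the bot names, routes, and "facts" below are made up for illustration, not the actual tool's behaviour):

      ```python
      # Serve accurate-but-irrelevant pages to suspected crawlers,
      # with links that lead deeper into more of the same.
      from flask import Flask, request

      app = Flask(__name__)

      # True, harmless, but off-topic facts to feed a crawler (illustrative).
      OFF_TOPIC_FACTS = [
          "Octopuses have three hearts.",
          "The adult human body has 206 bones.",
      ]

      # Assumed example crawler names; a real deployment would use its own list.
      SUSPECT_AGENTS = ("GPTBot", "CCBot", "Bytespider")


      def looks_like_bot(user_agent: str) -> bool:
          return any(name.lower() in user_agent.lower() for name in SUSPECT_AGENTS)


      @app.route("/article/<slug>")
      def article(slug: str):
          ua = request.headers.get("User-Agent", "")
          if looks_like_bot(ua):
              # Lure page: accurate but irrelevant content, plus links that
              # keep the crawler wandering through more lure pages.
              body = "<br>".join(OFF_TOPIC_FACTS)
              links = "".join(f'<a href="/article/lure-{i}">more</a> ' for i in range(3))
              return f"<html><body>{body}<p>{links}</p></body></html>"
          # Ordinary visitors get the real page.
          return f"<html><body>Real content for {slug}</body></html>"


      if __name__ == "__main__":
          app.run()
      ```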

    • sugar_in_your_tea@sh.itjust.works · 8 days ago

      Law enforcement doesn’t convict anyone; that’s a judge’s job. If a LEO falsely arrests you, you can sue them, and the case should be pretty open-and-shut if the arrest was due to an AI hallucination. Enough of that and LEOs will stop doing it.

      • Jarix@lemmy.world · 6 days ago

        More likely they will remove your ability to sue them, if you’re talking about the USA and many other countries.