• Cyrus Draegur@lemmy.zip
    1 month ago

    They’re still lying. For fuck’s sake. It’s like they impaled you on a pike and then only admitted, “okay, so we did prick you with that needle.”

    ALL IT DOES IS HALLUCINATE. ALL IT DOES IS HALLUCINATE. ALL IT DOES IS HALLUCINATE. ALL IT DOES IS HALLUCINATE!

    SOMETIMES the hallucinations happen to resemble reality. Just because a hallucination happens to look similar to reality does not make it real.

    IT IS NOT PERCEIVING REALITY.

    EVER!

    EVER!

    Ever.

    • eatCasserole@lemmy.worldM
      1 month ago

      It’s unfortunate that the word “hallucination” got associated with LLMs in the first place. A hallucination is an erroneous perception. These chatbots don’t even have the capacity to perceive, let alone perceive in error. They have no senses, no awareness, no intentions, nothing. It’s just an inanimate machine crunching through a complex formula. Any resemblance to reality is purely coincidental.