• eatCasserole@lemmy.world · 1 month ago

    It’s unfortunate that the word “hallucination” even got associated with LLMs in the first place… Hallucination refers to an erroneous perception. These chatbots don’t even have the capacity to perceive, let alone make errors. They have no senses, no awareness, no intentions, nothing. It’s just an inanimate machine crunching through a complex formula. Any resemblance to reality is purely coincidental.