• TrackinDaKraken@lemmy.world
    10 hours ago

    Yes, it’s different this time:

    In the past, people have often used tools from calculators to GPS systems for a kind of task-specific “cognitive offloading,” strategically delegating some jobs to reliable automated algorithms while using their own internal reasoning to oversee and evaluate the results. But the researchers argue that AI systems have given rise to a categorically different form of “cognitive surrender” in which users provide “minimal internal engagement” and accept an AI’s reasoning wholesale without oversight or verification. This “uncritical abdication of reasoning itself” is particularly common when an LLM’s output is “delivered fluently, confidently, or with minimal friction,” they point out.

  • WatDabney@lemmy.dbzer0.com
    11 hours ago

    Many (most?) people have always relied on somebody or something else to do the thinking for them.

    AI can simply be added to the long, long list of sources that people so inclined accept uncritically.