Yes, it’s different this time:
In the past, people have often used tools ranging from calculators to GPS systems for a kind of task-specific “cognitive offloading,” strategically delegating some jobs to reliable automated algorithms while using their own internal reasoning to oversee and evaluate the results. But the researchers argue that AI systems have given rise to a categorically different form of “cognitive surrender,” in which users provide “minimal internal engagement” and accept an AI’s reasoning wholesale, without oversight or verification. This “uncritical abdication of reasoning itself” is particularly common when an LLM’s output is “delivered fluently, confidently, or with minimal friction,” they point out.
Many (most?) people have always relied on somebody or something else to do the thinking for them.
AI can just be added to the long, long list of sources that people so inclined accept uncritically.
Personally I surrender my cognitive load to the comments section under articles, so thank you for putting me at ease about this situation!
I feel called out
AI doesn’t object or ask for compensation; it sits in their pocket, available 24/7, and its responses are sycophantic. There’s a difference between relying on humans to do their thinking and relying on a computer.
Fox News has some serious competition
Amen!
What’s this from?