• morrowind@lemmy.ml · 3 days ago

    And how do you think it predicts that? All that complex math can be clustered into higher-level structures. One could almost call it… thinking.

    Besides, we have reasoning models now, so they can emulate thinking if nothing else.

    • merc@sh.itjust.works · 3 days ago

      One could almost call it… thinking

      No, one couldn’t, unless one was trying to sell snake oil.

      so they can emulate thinking

      No, they can emulate generating text that looks like text typed up by someone who was thinking.

        • merc@sh.itjust.works · 2 days ago

          Yes, thinking involves signals firing in your brain. But not just any signals. Fire the wrong signals and someone’s having a seizure, not thinking.

          Just because LLMs generate words doesn’t mean they’re thinking. Thinking involves reasoning and considering something. It involves processing information, storing memories, then bringing them up later as appropriate. We know LLMs aren’t doing that because we know what they are doing, and what they’re doing is simply generating the next word based on previous words.
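
          For the record, “generating the next word based on previous words” is an autoregressive decoding loop. The sketch below is purely illustrative and not from the thread: it assumes the Hugging Face transformers library and the public gpt2 checkpoint, and greedily appends the single most likely next token at each step.

          ```python
          # Minimal sketch of greedy autoregressive decoding.
          # Assumptions (illustrative, not from the thread): the Hugging Face
          # "transformers" library and the public "gpt2" checkpoint.
          import torch
          from transformers import AutoModelForCausalLM, AutoTokenizer

          tokenizer = AutoTokenizer.from_pretrained("gpt2")
          model = AutoModelForCausalLM.from_pretrained("gpt2")
          model.eval()

          prompt = "Thinking involves reasoning and"
          input_ids = tokenizer(prompt, return_tensors="pt").input_ids

          # Generate 20 tokens, one at a time: each step conditions only on
          # the tokens so far and picks the single most likely next token.
          with torch.no_grad():
              for _ in range(20):
                  logits = model(input_ids).logits        # scores over the vocabulary
                  next_id = logits[0, -1].argmax()        # most likely next token
                  input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

          print(tokenizer.decode(input_ids[0]))
          ```

          The whole loop is the mechanism being argued over: the model’s only operation is scoring every vocabulary token as a continuation of the text so far and appending one.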