• Prandom_returns@lemm.ee · 1 year ago

      LLMs provide about as much information as a parrot repeating the words it hears most often.

      It’s a terrible, terrible “source” of information, and it will lead to an insane number of misinformed people.

                • Prandom_returns@lemm.ee · 1 year ago

                  It might gather information from all those sources (with or without consent), but what it returns is no more credible than a story from a granny at your local market.

                  Only if you prompt it to return links, and then read the information in those links yourself, have you actually read information from the source.

                  It has already been shown that LLMs are bad at summarising, the one thing techbros have been pushing them for. They’re bad at summarising, bad at coding, bad at maths, and fucking terrible at image making.

                  There’s a reason the output is called “slop”. And rightfully so.