- cross-posted to:
- funny@sh.itjust.works
- funny@lemmy.ml
cross-posted from: https://lemmy.sdf.org/post/40225122
A bloke at the pub told me
“So I was talking to god the other day, and he told me…” is about the level of crazy we are dealing with.
Let me show you guys something worse:
I always placed it a little more like “well, Aunt Marge said…”. I mean, you all know Aunt Marge. If she doesn’t know, she’ll make something up on the spot just to sound smart. But she also might actually know it, or have been told it recently by somebody else. That said, you can’t fucking trust her without looking it up, so never just ask Marge; go look it up yourself.
I’ve been using it a lot lately to compare specs on consumer items. If you can’t for the life of you figure out why two TVs are different prices without doing a heap of AV nerd research, ask it to compare the model numbers and give you a breakdown of the differences.
I don’t ask it for anything that could be a subjective opinion.
Not an overall bad fit, but you do have to be careful: if for some reason it can’t find any stats on one model over another, it may very well just give you its best guess. Unless it’s doing a straight web search, it seems to have about a 50/50 chance of knowing when it doesn’t know.
That usually means the manufacturer of the cheaper set got a better deal on bulk orders.
“I checked my horoscope” sounds a bit more accurate, but not by much…
“I asked ~~ChatGPT~~ my psychic”

Is this much different from “I googled it”, which seems to be a little more acceptable?
Yes, absolutely. When you google things you have to look at sources to find the information you need. It’s easy to notice whether a piece of information is consistent across various sources, or whether it’s just one guy saying something.
AI cuts all of that off. Even an AI “overview” can be harmful, as people assume the overview has the same information as the sources… when it can just be made up. AI answers don’t give you an easy way to double-check the information they’re giving you. Is the AI’s opinion a regurgitated opinion of the scientific majority, or is it a regurgitated opinion of John McBadOpinion?
Within GPT it does provide direct links to sources, but when it’s embedded outside of the native platform, maybe not. I mostly use it for finding hard-to-locate items, so I’m frequently following the links, and they’re usually correct. That being said, yeah, if you’re using it for anything “serious”, double- and triple-check the info.
A singular source isn’t worth much. Again: is that source mirroring a common opinion of the scientific world, or is it just some guy saying something?
It’s usually multiple sources. The number varies, but generally between 3 and 20 depending on the subject matter. There are different settings you can change depending on whether you want to be in “research” mode. You still need to be mindful, but the links are there in the native platform most of the time.
Hmm, fair enough. Though I still feel like being given an answer (however correct) on a platter would make me less likely to double-check, whereas when I google things on my own, the double-checking happens as I look for the answer (because I myself have to look through various sources).
ChatGPT is like having a friend who kisses and tells… ask it anything you ever wondered about people and places and things! It will give you whatever it regurgitates from the literature that was fed to it during training.
I see it as the same as ‘I asked a random demented person’
ChatGPT is useful for casual-discussion facts. As soon as anything remotely important leans on it, it shouldn’t be trusted any more than a guy down the pub.
ChatGPT serves sources with your answer. It’s incredibly useful to pull up and then verify the information.
How many people actually verify the information?
Also, last time I checked, the source was actually some text hallucinated by an LLM, even though I had specifically asked it to only use scientific sources.
I only found out because I also checked the sources quoted in this text and found that said sources simply didn’t exist. 🤷‍♂️

How many do it when writing their own shit or checking Wikipedia? All of my friends and peers check ChatGPT far more than regular search or wiki, precisely because it is more prone to mistakes. That doesn’t mean it’s not the quickest way to track down the correct answer. None of this is new. This isn’t going away, and that’s not how the majority of users with access feel.