this post was submitted on 15 Jul 2025
351 points (93.8% liked)

Fediverse

35431 readers
829 users here now

A community to talk about the Fediverse and all its related services using ActivityPub (Mastodon, Lemmy, KBin, etc).

If you want help moderating your own community, head over to !moderators@lemmy.world!

Rules

Learn more at these websites: Join The Fediverse Wiki, Fediverse.info, Wikipedia Page, The Federation Info (Stats), FediDB (Stats), Sub Rehab (Reddit Migration)

founded 2 years ago

When I tried it in the past, I kinda didn't take it seriously because everything was confined to its instance, but now there's full-featured global search and proper federation everywhere? Wow, I thought I'd heard there were technical obstacles making that very unlikely, but now it's just there and works great! I asked ChatGPT and it says this feature was added 5 years ago! Really? I'm not sure how I didn't notice it sooner. Was it really there for so long? With flairs showing the original instance a video comes from and everything?

[–] nulluser@lemmy.world 12 points 1 day ago* (last edited 1 day ago) (1 children)

It depends on what info you're trying to find.

I was recently trying to figure out the name of a particular uncommon type of pipe fitting. I could describe what it looked like, but had no idea what it was called. I described it to ChatGPT, which gave me a name, which I could then search for with a normal search engine to confirm that the name was correct. Sure enough, the search results took me to plumbing supply companies selling it, with pictures that matched what I described.

But asking it when a particular feature was added to a piece of software? There's no additional information in the answer that would help you confirm it's correct.

ETA: The above strategy has also failed me many times, though: ChatGPT gives me information, and the follow-up searches only confirm that it hallucinated the answer. Just wanted to say that to reinforce that you have to assume it's hallucinating until you get independent confirmation.

[–] Ulrich@feddit.org 3 points 1 day ago* (last edited 1 day ago)

You should use something like Perplexity instead, which actually provides links to where it found the information. It will still make shit up, but at least it's easier to tell when it does.