this post was submitted on 26 Jun 2025
Technology
Accuracy and hallucination are two ends of a spectrum.
If you keep hallucination to a minimum, the LLM will faithfully reproduce what's in its training set, but the result may not fit the query very well.
The alternative is to turn up the so-called temperature, which makes replies fit the query better, but hallucinations go up as well.
In the end it's a balance between responses that stay closer to the dataset (factual) and responses that stay closer to the query (creative).
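The temperature knob mentioned above is usually implemented as a divisor on the model's logits before the softmax. A minimal sketch (the logit values here are made up for illustration): low temperature sharpens the distribution toward the single most likely token, while high temperature flattens it so less likely tokens get sampled more often.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then normalize into probabilities."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token logits: one dominant token, two alternatives.
logits = [4.0, 2.0, 1.0]

cold = softmax_with_temperature(logits, 0.2)  # low T: sharply peaked on token 0
hot = softmax_with_temperature(logits, 2.0)   # high T: probability spread out
```

With temperature 0.2 the top token ends up with nearly all the probability mass, so sampling is close to deterministic; with temperature 2.0 the alternatives become live options, which is where the extra creativity (and the extra hallucination risk) comes from.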