Our company rolled out a new and innovative internal LLM. It's intended to help with coding tasks and find internal documentation.
(Which is really just a wrapper over Copilot.)
When looking for documentation, it fails harder than pasting the same text into the search bar.
I don't find LLMs helpful for coding since they're wrong so often, but after some encouragement I decided to try using it to help me debug a small 12-line Bash script.
Whenever I posted examples, it would fail silently. I assumed it wasn't sanitizing inputs, so I tried a few other methods of wrapping the text. It would still fail silently.
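Roughly what one of the "wrapping" attempts looked like, as a sketch (debug_me.sh is a stand-in name for the actual script, and the clipboard tool is just whatever your system has):

```bash
#!/usr/bin/env bash
# Wrap the script in a fenced code block before pasting it into the chat,
# on the theory that unescaped text was what it choked on.
# debug_me.sh is a placeholder name, not the real script.
{
  printf '```bash\n'
  cat debug_me.sh
  printf '```\n'
} | xclip -selection clipboard   # or pbcopy on macOS
```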
Eventually, half an hour in, I decided to just Google it. After I'd resolved my own problem, it got back to me, 45+ minutes later, with "an error occurred trying to handle your request".
These things suck so bad they can't even error out effectively.