this post was submitted on 03 Dec 2025
737 points (98.3% liked)
[–] 87Six@lemmy.zip 35 points 2 days ago (21 children)

Kinda wrong to say "without permission". The user can choose whether the AI can run commands on its own or ask first.

Still, REALLY BAD, but the title doesn't need to make it worse. It's already horrible.

[–] utopiah@lemmy.world 9 points 2 days ago* (last edited 2 days ago) (7 children)

The user can choose whether the AI can run commands on its own or ask first.

That implies the user understands every single command with every single parameter. That's impossible even for experienced programmers; here is an example:

rm *filename

versus

rm * filename

where a single character makes the entire difference between deleting only the files ending in filename versus deleting all files in the current directory plus the file named filename.

Of course you will spot it here because you've been primed for it. In a normal workflow, under pressure, it's a totally different story.
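The difference is easy to see without deleting anything: the shell expands the glob before rm ever runs, so prefixing the command with echo reveals the argument list rm would actually receive. A quick sketch (the scratch directory and file names here are made up for illustration, not from the video):

```shell
# Scratch directory so no real files are at risk (path is illustrative)
rm -rf /tmp/rm_demo && mkdir /tmp/rm_demo && cd /tmp/rm_demo
touch report.txt old_filename filename

# Glob expansion happens in the shell, before rm sees anything:
echo rm *filename     # rm filename old_filename
echo rm * filename    # rm filename old_filename report.txt filename
```

With the stray space, `*` alone matches everything in the directory, so `report.txt` lands on rm's argument list too, followed by `filename` as a separate argument.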

Also, IMHO more importantly: if you watch the video (~7 min in), they clarified that they expected the "agent" to stick to the project directory, not to be able to go "out" of it. They were obviously, painfully, wrong, but it would have been a reasonable assumption.

[–] nutsack@lemmy.dbzer0.com 1 points 1 day ago* (last edited 1 day ago) (1 children)

That implies the user understands every single code with every single parameters.

why not? you can even ask the ai if you don't know

There's no guarantee that it will tell you the truth. It could tell you to use Elmer's glue to keep the cheese from falling off your pizza. The AI doesn't "know" or "understand"; it just reproduces patterns from its training data. It's just very complex predictive text that you can give commands to.
