Google's Agentic AI wipes user's entire HDD without permission in catastrophic failure
(www.tomshardware.com)
I have no experience with this IDE, but in the log posted on Reddit the LLM is talking about a "step 620", so this is hundreds of queries away from the initial one. The context must have been massive; usually after this many subsequent queries they start hallucinating badly.
Let me explain what I mean: those models have no memory at all. Each request starts from a blank slate, so when you have a "conversation" with one, the chat program is actually including all the previous interactions (or a summary of them) plus all the relevant parts of the code, simulating a conversation with a human. So the user didn't just ask "can you clear the cache"; they effectively sent the result of 600 messages, plus kilobytes of generated code, plus "can you clear the cache", and that is what causes destructive hallucinations.
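To make this concrete, here is a minimal sketch of how a typical chat client works under the hood. This is not Google's actual code, and `call_model` is a hypothetical stand-in for a real completion API; the point is only that the full transcript gets concatenated and resent on every single turn, so the prompt keeps growing even when each new message is tiny.

```python
# Sketch: a "conversation" with a stateless LLM is really one giant,
# ever-growing prompt resent on every turn.
# `call_model` is a hypothetical placeholder, not a real API.

def call_model(prompt: str) -> str:
    # A real client would send `prompt` to the model here.
    return f"(reply to {len(prompt)} chars of context)"

history: list[str] = []

def chat(user_message: str) -> str:
    history.append(f"User: {user_message}")
    # The model has no memory of its own: the entire transcript so far
    # is joined into a single prompt for this one request.
    prompt = "\n".join(history)
    reply = call_model(prompt)
    history.append(f"Assistant: {reply}")
    return reply

# Simulate an agent grinding through hundreds of steps.
for step in range(600):
    chat(f"step {step}: do the next thing")

final_prompt_size = len("\n".join(history))
print(final_prompt_size)
```

By "step 620" the agent is reasoning over this whole accumulated blob, not just the latest instruction, which is why late-conversation requests behave so differently from early ones.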