this post was submitted on 08 Aug 2025
745 points (96.6% liked)

Technology

Or my favorite quote from the article

"I am going to have a complete and total mental breakdown. I am going to be institutionalized. They are going to put me in a padded room and I am going to write... code on the walls with my own feces," it said.

[–] cabillaud@lemmy.world 8 points 4 days ago (2 children)

Could an AI use another AI if it found it better for a given task?

[–] panda_abyss@lemmy.ca 4 points 4 days ago

Yes, and this is pretty common with tools like Aider — one LLM plays the architect, another writes the code.

Claude Code now has subagents that work the same way, but they only use Claude models.
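
Roughly what that pattern looks like in code: a minimal sketch of the architect/editor split using the OpenAI-compatible chat API as a stand-in. The model names, prompts, and helper function are placeholders, not Aider's or Claude Code's actual implementation.

```python
# Minimal sketch of the architect/editor split: one model plans, another codes.
# Model names are placeholders; any OpenAI-compatible endpoint would work.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def architect_then_edit(task: str) -> str:
    # The "architect" model only produces a high-level plan.
    plan = client.chat.completions.create(
        model="architect-model",  # placeholder for a strong reasoning model
        messages=[
            {"role": "system", "content": "Plan the change. Do not write code."},
            {"role": "user", "content": task},
        ],
    ).choices[0].message.content

    # The "editor" model turns that plan into actual code.
    return client.chat.completions.create(
        model="editor-model",  # placeholder for a cheaper, faster coding model
        messages=[
            {"role": "system", "content": "Implement the plan as code."},
            {"role": "user", "content": f"Task: {task}\n\nPlan:\n{plan}"},
        ],
    ).choices[0].message.content
```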

[–] jj4211@lemmy.world 3 points 4 days ago

The overall interface can, which leads to fun results.

Prompt for image generation and you have one model handling the text and a different model generating the image. The text model pretends it is creating the image, but it has no idea what the result looks like, so you can make the text and image interaction make no sense, or it will do that all on its own. Have it generate an image, then lie to it about what it produced, and watch it demonstrate that it has no idea what picture was ever shown, all while pretending it does and never admitting it delegated the image to another model. It just lies and says "I" am correcting that for you. Basically it talks like an executive at a company, which helps explain why so many executives are true believers.
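
Under the hood that delegation is roughly this shape (a hedged sketch, not any vendor's actual pipeline; the model names are placeholders): the chat model only ever emits a text prompt, a separate image model renders it, and nothing about the finished picture flows back to the chat model.

```python
# Sketch of why the chat model can't actually "see" the image it claims to make.
# Model names are placeholders; the flow is the point, not the specific API.
from openai import OpenAI

client = OpenAI()

def chat_then_delegate_image(user_request: str) -> tuple[str, str]:
    # Step 1: the text model writes a prompt for the image generator.
    # It never receives pixels, only text.
    prompt = client.chat.completions.create(
        model="chat-model",  # placeholder
        messages=[
            {"role": "system", "content": "Write a prompt for an image generator."},
            {"role": "user", "content": user_request},
        ],
    ).choices[0].message.content

    # Step 2: a completely separate model renders the image from that prompt.
    image_url = client.images.generate(model="image-model", prompt=prompt).data[0].url

    # Step 3: any follow-up conversation goes back to the text model, which only
    # knows the prompt it wrote, so it will "confirm" details it has never seen.
    return prompt, image_url
```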

Another common pattern is for the ensemble to recognize mathy input and hand it off to a math engine, perhaps after using the LLM to normalize the expression first.
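
A toy version of that routing, heavily simplified and assuming SymPy as the math engine: if the query parses as a symbolic expression, let the math engine answer it exactly; otherwise fall through to the language model.

```python
# Toy router: anything that parses as a symbolic expression goes to SymPy;
# everything else would be handed to the LLM. Simplified illustration only.
import sympy

def route(query: str) -> str:
    try:
        expr = sympy.sympify(query)       # crude "is this mathy?" check: does it parse?
        return str(sympy.simplify(expr))  # exact answer from the math engine
    except (sympy.SympifyError, SyntaxError, TypeError):
        return f"[send to LLM]: {query}"  # not math; leave it to the language model

print(route("2**10 + sqrt(16)"))  # -> 1028
print(route("hello world"))       # -> [send to LLM]: hello world
```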