this post was submitted on 10 Jul 2025
252 points (94.4% liked)

Technology


A robot trained on videos of surgeries performed a lengthy phase of a gallbladder removal without human help. In its first operation on a lifelike patient, the robot responded to and learned from the team's voice commands during the procedure, like a novice surgeon working with a mentor.

The robot performed unflappably across trials, matching the expertise of a skilled human surgeon even during the kinds of unexpected scenarios typical of real-life medical emergencies.

[–] DeathByBigSad@sh.itjust.works 81 points 1 day ago (3 children)

Good, now add jailtime for the ceo if something goes wrong, then we'll have a very safe tech.

[–] GreenKnight23@lemmy.world 10 points 1 day ago (1 children)

know what? let's just skip the middleman and have the CEO undergo the same operation. you know like the taser company that tasers their employees.

can't have trust in a product unless you use the product.

[–] cactusupyourbutt@lemmy.world 5 points 18 hours ago (1 children)

I understand that what you are saying is intended as "if they trust their product, they should use it themselves," and I agree with that

I do think that undergoing an operation a person doesn't need isn't ethical, however

[–] GreenKnight23@lemmy.world 1 points 18 hours ago

who said they won't need it 😐

[–] rottingleaf@lemmy.world 3 points 1 day ago

Nah, just a thorough reproduction of the consequences of that wrong.

[–] njordomir@lemmy.world 1 points 16 hours ago

Inb4 someone added Texas Chainsaw Massacre and Saw to the training data.