this post was submitted on 28 Feb 2025
153 points (97.5% liked)

[–] WalnutLum@lemmy.ml 27 points 1 week ago (6 children)

I think most ML experts (who weren't being paid out the wazoo to say otherwise) have been saying we're on the tail end of the LLM technology sigmoid curve. (Basically, treating an LLM as a stochastic index, the real measure of training-algorithm quality is query accuracy per training datum.)

Even with DeepSeek's methodology, you see smaller and smaller returns on each additional unit of training input.
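
To make the diminishing-returns point concrete, here's a toy sketch (every number is made up, purely to show the shape of the curve): once you're past the inflection point of a sigmoid, each 10x of extra training data buys a smaller accuracy bump.

```python
import numpy as np

# Toy model only: pretend benchmark accuracy follows a sigmoid in
# log10(training tokens). midpoint/width are arbitrary shape parameters,
# not measurements from any real model.
def accuracy(tokens, midpoint=11.0, width=1.5):
    return 1.0 / (1.0 + np.exp(-(np.log10(tokens) - midpoint) / width))

for tokens in [1e11, 1e12, 1e13, 1e14]:
    gain = accuracy(tokens * 10) - accuracy(tokens)
    print(f"{tokens:.0e} tokens -> accuracy {accuracy(tokens):.3f}, "
          f"gain from 10x more data: {gain:.3f}")
```

Each step up in data costs 10x more and returns less, which is the whole "tail end of the curve" argument.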

[–] MDCCCLV@lemmy.ca 14 points 1 week ago (5 children)

At this point it's useful for doing some specific things, so the way to make it great is to make it cheap and accessible. Being able to run it locally would be way more useful.
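
And local is already pretty approachable. A rough sketch using llama-cpp-python (the model path and parameters below are placeholders, not a recommendation for any particular model):

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Placeholder path: any GGUF-quantized model you've downloaded locally
llm = Llama(model_path="./models/some-model.Q4_K_M.gguf", n_ctx=2048)

out = llm("Q: Why run an LLM locally? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```

Quantized models like that run on ordinary consumer hardware, no data center required.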

[–] dustyData@lemmy.world 4 points 1 week ago (1 children)

Sure, but then what would they do with their multi-billion-dollar data center plugged into a nuclear power plant?

[–] WhatAmLemmy@lemmy.world 5 points 1 week ago* (last edited 1 week ago)

Can we skip the dog-and-pony show and get straight to paying the orphan-crushing machine?
