this post was submitted on 05 Nov 2025
476 points (99.0% liked)

"I've been saving for months to get the Corsair Dominator 64GB CL30 kit," one beleagured PC builder wrote on Reddit. "It was about $280 when I looked," said u/RaidriarT, "Fast forward today on PCPartPicker, they want $547 for the same kit? A nearly 100% increase in a couple months?"

[–] Kissaki@feddit.org 5 points 12 hours ago* (last edited 12 hours ago) (1 children)

I suspect RAM may become increasingly useful with the shift from pure chat LLMs to connected agents, MCP, and caching results and data to scale things like public Internet search and services.

When I think of database server software, a lot of its performance gains come from keeping frequently used data in RAM. With LLM systems expanding, along with their concerns (backing data, connectedness, the need for optimisation), a shift toward caching and keeping data in RAM seems to suggest itself. These systems are already big and wasteful and operate on a lot of data, so it seems plausible that such a cache would not be small.
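The caching pattern described above (keep hot results in RAM, evict the least recently used entry when memory runs out) can be sketched with a minimal LRU cache. This is a hypothetical illustration, not any particular serving stack; the "query" keys and result strings are made up for the example.

```python
from collections import OrderedDict


class LRUCache:
    """Minimal in-RAM LRU cache: hot entries stay resident,
    the least recently used entry is evicted when capacity is hit."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)  # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict the oldest entry


# Hypothetical usage: caching search/agent results keyed by query.
cache = LRUCache(capacity=2)
cache.put("query:ram-prices", "result A")
cache.put("query:gpu-prices", "result B")
cache.get("query:ram-prices")            # touch: now most recently used
cache.put("query:ssd-prices", "result C")  # evicts "query:gpu-prices"
print(cache.get("query:gpu-prices"))     # -> None (evicted)
print(cache.get("query:ram-prices"))     # -> result A (still hot)
```

The point of the sketch is the trade-off in the comment: the more entries such a cache can keep resident, the fewer expensive recomputations or lookups, which is exactly why this kind of workload puts pressure on RAM capacity.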

[–] brucethemoose@lemmy.world 1 point 3 hours ago

Yeah, exactly... In other words, 'general server buildout.'