this post was submitted on 08 Jul 2025
243 points (98.4% liked)

Technology

[–] CidVicious@sh.itjust.works 48 points 4 days ago (2 children)

Abandoned mine several years ago. Kind of a shame; they were a good option for a while for people who weren't Windows fans but didn't want to run Linux full time. Apple just doesn't really have any offerings for people who want an upgradeable desktop but don't want to drop the money on a Mac Pro.

[–] thedeadwalking4242@lemmy.world 5 points 2 days ago (1 children)

I had a harder time getting a Hackintosh going than just running Linux full time.

[–] CidVicious@sh.itjust.works 4 points 2 days ago

I didn't find it that difficult, but there was a lot of up-front homework to make sure you had a compatible hardware configuration, since everything had to match the limited set of hardware the real Mac configurations supported. I recall running into a problem where I wasn't getting a picture on my monitor and couldn't figure out why, since my video card was supported and the drivers were fine. The problem, as it turned out? I had my monitor connected via DVI, and Macs had never supported DVI, so there were no drivers. Once the install was done, it pretty much Just Worked. Linux installs are pretty easy these days, but debugging problems can be very difficult. The hard part of the Hackintosh was keeping up with upgrades, since they had to be done manually (they could potentially break things).

[–] maccentric@sh.itjust.works -5 points 4 days ago (2 children)

The SSD in the M4 mini is upgradable, for those who aren’t aware.

[–] Viper_NZ@lemmy.nz 20 points 3 days ago (2 children)

It’s replaceable, it’s not upgradable.

Apple doesn’t use standard NVMe M.2 drives. The controller is built into the SoC rather than being on the storage device itself.
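That split can be sketched as a toy model (every class name and number below is illustrative, not Apple's actual design): a standard NVMe drive carries its own controller and can describe itself to any host, whereas raw NAND behind an SoC controller only works if the host firmware knows how to provision those particular dies.

```python
from dataclasses import dataclass


@dataclass
class StandardNvmeDrive:
    """Controller lives on the drive: it self-describes over the NVMe spec."""
    capacity_gb: int
    firmware: str

    def identify(self) -> dict:
        # Any host can ask the drive what it is (roughly what `nvme id-ctrl` does).
        return {"capacity_gb": self.capacity_gb, "firmware": self.firmware}


@dataclass
class RawNandModule:
    """No controller on board: just flash dies plus their geometry."""
    dies: int
    die_gb: int


class SocStorageController:
    """Controller in the SoC: the host must manage the NAND itself."""
    supported_die_gb = {64, 128}  # illustrative limit baked into host firmware

    def provision(self, module: RawNandModule) -> int:
        # The host only accepts NAND it has a profile for.
        if module.die_gb not in self.supported_die_gb:
            raise ValueError("host controller has no profile for these dies")
        return module.dies * module.die_gb
```

In this sketch, swapping the first kind of drive for a bigger one just works; with the second design, you can only install modules the host controller already knows about, which is the "replaceable, not upgradeable" distinction.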

[–] maccentric@sh.itjust.works 18 points 3 days ago (1 children)
[–] timetraveller@lemmy.world 3 points 3 days ago

Saving this for later.

[–] A_Random_Idiot@lemmy.world 10 points 2 days ago (3 children)

It never ceases to amaze me how much time, energy, and money Apple spends engineering things to be worse for customers.

[–] Samskara@sh.itjust.works 5 points 2 days ago

In this case Apple also prioritizes performance.

[–] Viper_NZ@lemmy.nz 1 points 2 days ago (1 children)

It’s more cost effective to integrate the controller.

Being worse for customers is just a happy accident.

[–] A_Random_Idiot@lemmy.world -1 points 2 days ago (1 children)

You and I both know that Apple doesn't do this shit for cost efficiency.

They do it to make shit worse for consumers and "unauthorized" repair services.

[–] Viper_NZ@lemmy.nz 2 points 1 day ago

They’re a business. Reducing their costs (while charging you a premium) is absolutely what they do.

Apple’s whole deal for decades now has been building a vertical supply chain. Using their own SSD controller is one less component they have to pay others for.

They just don’t give a shit about the downsides: aftermarket repair and user upgradeability.

[–] jabjoe@feddit.uk -1 points 1 day ago* (last edited 1 day ago) (2 children)

Why? Anti-features aren't just an Apple thing. All of big tech does it to users.

Edit: And automotive, white goods companies, etc., etc.

[–] A_Random_Idiot@lemmy.world 2 points 1 day ago (1 children)

Other companies aren't engineering serial numbers and other identity information into every component, even shit as small as Hall-effect sensors, so a part can't be taken from a damaged device to repair a different device of the same make and model.

To act like what Apple does is an industry standard is nothing but blatant Apple fanboy propaganda.
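The serial-pairing complaint can be sketched as a toy model (entirely hypothetical names; Apple's real pairing mechanism is not public): the device keeps a factory record of each component's serial and rejects any part whose serial doesn't match, even a part pulled from an identical donor device.

```python
from dataclasses import dataclass, field


@dataclass
class Component:
    kind: str     # e.g. "display", "hall_effect_sensor"
    serial: str


@dataclass
class Device:
    # kind -> serial recorded at the factory (hypothetical scheme)
    paired: dict = field(default_factory=dict)

    def accepts(self, part: Component) -> bool:
        # A donor part from an identical device still fails the check,
        # because only the factory-recorded serial matches.
        return self.paired.get(part.kind) == part.serial


phone_a = Device(paired={"hall_effect_sensor": "SN-001"})
donor = Component("hall_effect_sensor", "SN-002")  # from an identical phone_b
print(phone_a.accepts(donor))                                      # False
print(phone_a.accepts(Component("hall_effect_sensor", "SN-001")))  # True
```

Under a scheme like this, functionally identical parts are interchangeable in hardware but rejected in software, which is exactly what makes donor-board repairs fail.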

[–] jabjoe@feddit.uk 1 points 1 day ago

Oh no, they are bastards. Extra-big bastards in a sea of bastards. I blame regulators. The hope is that right to repair becomes law in more and more places, in more and more market areas.

Without the EU regulators, Apple would never have gone USB-C.

[–] squaresinger@lemmy.world 1 points 12 hours ago

There are some companies as bad as Apple (John Deere comes to mind), but it's certainly not the norm.

User-replaceable standard M.2 SSDs are bog standard, and non-standard formats are really rare. Apart from Apple, I can't think of many companies that do that. IIRC Red Magic cameras and Synology NAS, but those are the only ones I can think of.

[–] Appoxo@lemmy.dbzer0.com 2 points 3 days ago (1 children)

Is this a take in regards to soldering in new flash chips, or replacing a board and then needing to wrestle with Apple support during an RMA to replace a faulty component? (Because I quite confidently believe Apple will cross-check your hardware against their records via the serial number.)

And I don't believe regular PC manufacturers/OEMs are that hard to argue with if I've inserted my own SSD.

[–] maccentric@sh.itjust.works 6 points 3 days ago* (last edited 3 days ago) (1 children)
[–] Appoxo@lemmy.dbzer0.com 1 points 3 days ago (1 children)

And how much does Apple like that?

[–] maccentric@sh.itjust.works 9 points 3 days ago (1 children)
[–] Appoxo@lemmy.dbzer0.com 1 points 3 days ago (1 children)

Doesn't work during an RMA.
After that: yeah, sure.

[–] prettybunnys@sh.itjust.works 2 points 2 days ago

You could always just swap the original part back in if you need to do an RMA.