this post was submitted on 06 Mar 2025

Privacy


A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.


Like, there are a lot of people freaking out about Apple ending end-to-end encryption in iCloud in the UK. I'm just like: so what? It was probably backdoored from the beginning.

So is Big Tech's E2E actually not backdoored? Or is that just a PR stunt to trick people into trusting iCloud, and this is a secret honeypot? 🤔
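
To be clear about what I mean by "backdoored": with real E2E the provider only ever stores ciphertext, while a key-escrow setup means they (or a government) also hold a copy of the key. Here's a toy sketch in Python using the cryptography library; purely illustrative, obviously not Apple's actual protocol:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# True E2E: the key is generated and kept only on the user's device.
device_key = Fernet.generate_key()
ciphertext = Fernet(device_key).encrypt(b"my private photo")
cloud = {"blob": ciphertext}  # the provider only ever sees ciphertext

# "Backdoored" / key-escrow variant: the provider also stores a copy of the key,
# so anyone with access to the escrowed copy can decrypt server-side.
cloud_escrowed = {"blob": ciphertext, "escrowed_key": device_key}
print(Fernet(cloud_escrowed["escrowed_key"]).decrypt(cloud_escrowed["blob"]))
```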

What are your thoughts?

top 5 comments
[–] Steve@communick.news 1 points 4 days ago

There is a difference between "probably backdoored" and "we're not even trying to look secure anymore."

[–] AbouBenAdhem@lemmy.world 1 points 4 days ago

If they tell law enforcement they can’t produce an unencrypted copy and it’s later proven that they could, the potential penalty would likely be more severe than anything they could have gained by using the data themselves. And any employee (or third party they tried to sell the data to) could rat them out—so they’d have to keep the information within a circle too small to make use of it at scale. And even if it never leaked, hackers would eventually find and exploit the backdoor, exposing its existence. And in either case they’d also have to face lawsuits from shareholders (rightly) complaining that they were never warned of the legal risk.

[–] jqubed@lemmy.world 0 points 4 days ago (1 children)

I’m no expert, but given the repeated efforts from governments around the world to get backdoors added to encryption, and the frequent pushback from big tech, or at least Apple, I’m more inclined to think there aren’t backdoors currently, or at least weren’t until recently. At least, not easy ones, not official ones.

As an example, recall a few years ago there was a terror-related attack in the U.S. where someone tied to Muslim extremists went on a shooting spree before taking his own life (I’m not bothering to look up the details and my recollection could be flawed). The attacker used an iPhone, and the U.S. government took the opportunity of strong public outrage to try to force Apple to create a tool to break the encryption on the iPhone so they could examine its contents. Apple resisted and the effort went to court, with the decision eventually being that Apple did not have to break the encryption. The government then revealed that they had access to a third-party tool that they used to break into the phone and recover its contents.

That’s pretty much been the pattern before and since: a government will try to find a cause that seems likely to gather widespread support and use that to get a backdoor they promise not to abuse, and the companies push back to varying degrees. All the while there seem to be third-party tools that exploit various flaws, including zero-day flaws, to gain the access the companies won’t provide. My impression is that at least a couple of times a year there’s a story about an Apple security update patching these holes and notifying certain users if they may have been targeted.

It’s possible that’s all just theater put on by the U.S. and allies to help Apple or Google tell governments the U.S. doesn’t trust, “see, we can’t even give the U.S. government we’re subject to access, so we certainly can’t give you access.” Given some of the cases that have been used to try to force access, though, I’m more inclined to think the government really doesn’t have the easy access some might like.

Of course, it’s also possible that some of the flaws used by zero-day exploits to gain access are intentionally planted, either by the software companies or by an individual programmer acting at a government’s behest. The later patches could be to maintain appearances to outsiders, since there always seem to be additional flaws. Still, programming is hard enough and operating systems are complex enough that I’m more inclined to say that usually these really are just human error and not something malicious.

None of that is to say that anyone should fully trust these encryption systems. Used properly, they’re probably good enough against ordinary hackers, people just looking for financial rewards. You can keep your family photos, important records, school notes, etc. on them without worrying too much. Financial records you might want to doubly encrypt, just so they’re not so easy to exploit if there is a breach and data dump. If you’re doing something any government cares enough about to really investigate, they’re probably going to find a way into your computer, phone, or cloud service, depending on how motivated they are. Maybe not some impoverished “third-world” governments, but most of the big ones have some resources. I’d be extremely cautious about things that could actually send someone to jail, either in your own country or one that is less friendly.
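
By "doubly encrypt" I mean something like this; a rough Python sketch with the cryptography library, illustrative only:

```python
# pip install cryptography
from cryptography.fernet import Fernet

inner_key = Fernet.generate_key()   # a key only you hold
outer_key = Fernet.generate_key()   # e.g. the cloud service's client-side key

record = b"2024 tax return, account numbers, etc."

# Encrypt locally first, then let the service encrypt the result again.
once = Fernet(inner_key).encrypt(record)
twice = Fernet(outer_key).encrypt(once)

# A breach that exposes only the outer key still yields ciphertext:
assert Fernet(outer_key).decrypt(twice) == once   # plaintext still needs inner_key
assert Fernet(inner_key).decrypt(once) == record
```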

[–] Lojcs@lemm.ee 0 points 4 days ago (1 children)

> The government then revealed that they had access to a third-party tool that they used to break into the phone and recover its contents.

I'm not sure if we're thinking of the same case, but I also remember that the tool wasn't ready at the beginning, which is why they tried the court route until it was.

[–] FauxLiving@lemmy.world 1 points 5 hours ago

It wasn't that it wasn't ready; it's just that it cost them $1M to hire a private contractor to unlock it.

A court-ordered redesign of the password lock timeout would have been free.
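
Rough numbers on why that timeout is the whole ballgame; the per-attempt speed and delay below are made-up assumptions, not Apple's actual figures:

```python
# Back-of-the-envelope brute-force math for a 6-digit passcode.
# Both rates below are assumptions for illustration, not documented values.
PASSCODES = 10 ** 6            # 6-digit numeric passcode space
FAST = 0.08                    # sec/attempt if the lockout timeout is removed
THROTTLED = 3600               # sec/attempt with escalating delays (~1 try/hour)

print(f"No timeout:   ~{PASSCODES * FAST / 86_400:.1f} days to try every passcode")
print(f"With timeout: ~{PASSCODES * THROTTLED / (86_400 * 365):,.0f} years")
```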