this post was submitted on 03 Aug 2025
521 points (93.6% liked)

Technology

[–] digger@lemmy.ca 203 points 3 days ago (8 children)

How much longer until the AI bubble pops? I'm tired of this.

[–] wewbull@feddit.uk 39 points 2 days ago (1 children)

It's when the coffers of Microsoft, Amazon, Meta and the investment banks dry up. All of them are losing billions every month, and it's all driven by fewer than 10 companies. Nvidia is lapping up the money of course, but once the AI companies stop buying GPUs in crazy numbers it's going to be a rocky ride down.

[–] astanix@lemmy.world 7 points 2 days ago (4 children)

Is it like crypto, where CPUs were good, then GPUs, then FPGAs, then ASICs? Or is this different?

[–] wewbull@feddit.uk 16 points 2 days ago (1 children)

I think it's different. The fundamental operation of all these models is multiplying big matrices of numbers together. GPUs are already optimised for this. Crypto was trying to make the algorithm fit the GPU rather than it being a natural fit.

With FPGAs you take a 10x loss in clock speed but can have precisely the algorithm you want. ASICs then give you the clock speed back.

GPUs are already ASICs that implement the ideal operation for ML/AI, so FPGAs would be a backwards step.
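The "multiplying big matrices of numbers together" point is concrete enough to show in a few lines of plain Python. A toy sketch with tiny made-up shapes, nothing from any real model (real stacks run this same loop on GPU tensor cores):

```python
# Toy illustration: an LLM forward pass is essentially stacked
# matrix multiplications (attention and MLP layers alike).

def matmul(a, b):
    """Multiply matrix a (m x k) by matrix b (k x n) with plain loops."""
    m, k, n = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

# A "hidden state": 2 tokens, 3 features each (made-up numbers).
x = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]

# A 3x3 "weight matrix" (one linear layer, also made up).
w = [[0.1, 0.0, 0.0],
     [0.0, 0.1, 0.0],
     [0.0, 0.0, 0.1]]

h = matmul(x, w)    # one layer = one matmul
out = matmul(h, w)  # stacking layers = chaining matmuls
```

A GPU is already specialised hardware for exactly this inner loop, which is why there's no obvious FPGA/ASIC step left to jump to.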

[–] cley_faye@lemmy.world 4 points 2 days ago (3 children)

It's probably different. The crypto bubble couldn't actually do much in the field of useful things.

Now, I'm saying that with a HUGE grain of salt, but there are decent applications for LLMs (let's not call that AI). Unfortunately, these usages are not really in the sights of any business putting tons of money into their "AI" offers.

I kinda hope we'll get better LLM hardware to operate privately, using ethically sourced models, because some stuff is really neat. But that's not the push they're going for right now. Fortunately, we can already sort of do that, although the source of many publicly available models is currently… not that great.

[–] KumaSudosa@feddit.dk 6 points 2 days ago

LLMs are absolutely amazing for a lot of things. I use them at work all the time to check code blocks or remember syntax. They are NOT and should NOT be your main source of general information, and we collectively have to realise how problematic and energy-consuming they are.

[–] cley_faye@lemmy.world 19 points 2 days ago

We're still in the "IT'S GETTING BILLIONS IN INVESTMENTS" part. Can't wait for this to run out too.

[–] Defaced@lemmy.world 14 points 2 days ago (4 children)

Here's the thing: it kind of already has. The new AI push is related to smaller projects and AI agents like Claude Code and GitHub Copilot integration. MCPs are also starting to pick up some steam as a way to refine prompt engineering. The basic AI "bubble" popped already; what we're seeing now is an odd arms race of smaller AI projects, thanks to companies like Deepseek pushing AI hosting costs so low that anyone can reasonably host and tweak their own LLMs without spending a fortune. It's really an interesting thing to watch, but honestly I don't think we're going to see the major gains the tech industry is trying to push anytime soon. Take any claims of AGI and OpenAI "breakthroughs" with a mountain of salt, because they will do anything to keep the hype up and drive up their stock prices. Sam Altman is a con man and nothing more; don't believe what he says.

[–] DreamlandLividity@lemmy.world 92 points 3 days ago* (last edited 3 days ago) (10 children)

The worst part is that once again, proton is trying to convince its users that it's more secure than it really is. You have to wonder what else they are lying or deceiving about.

[–] hansolo@lemmy.today 84 points 3 days ago (6 children)

Both your take, and the author, seem to not understand how LLMs work. At all.

At some point, yes, an LLM model has to process clear text tokens. There's no getting around that. Anyone who creates an LLM that can process 30 billion parameters while encrypted will become an overnight billionaire from military contracts alone. If you want absolute privacy, process locally. Lumo has limitations, but goes farther than duck.ai at respecting privacy. Your threat model and equipment mean YOU make a decision for YOUR needs. This is an option. This is not trying to be one size fits all. You don't HAVE to use it. It's not being forced down your throat like Gemini or CoPilot.

And their LLM? It's Mistral, OpenHands and OLMO, all open source. It's in their documentation. So this article is straight-up lying about that. Like.... Did Google write this article? It's simply propaganda.

Also, Proton does have some circumstances where it lets you decrypt your own email locally. Otherwise it's basically impossible to search your email for text in the email body. They already had that as an option, and if users want AI assistants, that's obviously their bridge. But it's not a default setup. It's an option you have to set up. It's not for everyone. Some users want that. It's not forced on everyone. Chill TF out.

[–] DreamlandLividity@lemmy.world 15 points 3 days ago* (last edited 3 days ago) (3 children)

Their AI is not local, so adding it to your email means breaking e2ee. That's to some extent fine. You can make an informed decision about it.

But Proton is not putting warning labels on this. They are trying to confuse people into thinking it is the same security as their e2ee mails. Just look at the "zero trust" bullshit on Proton's own page.

[–] jjlinux@lemmy.zip 38 points 3 days ago* (last edited 3 days ago)

Where does it say "zero trust" 'on Protons own page'? It does not say "zero-trust" anywhere, it says "zero-access". The data is encrypted at rest, so it is not e2ee. They never mention end-to-end encryption for Lumo, except for ghost mode, and they are talking about the chat once it's complete and you choose to leave it there to use later, not about the prompts you send in.

Zero-access encryption

Your chats are stored using our battle-tested zero-access encryption, so even we can’t read them, similar to other Proton services such as Proton Mail, Proton Drive, and Proton Pass. Our encryption is open source and trusted by over 100 million people to secure their data.

Which means that they are not advertising anything they are not doing or cannot do.

By posting this disinformation, all you're achieving is getting people to backpedal to all the shit services out there for "free", because many will start believing that privacy is way harder than it actually is, so 'what's the point'; or, even worse, 'no alternative will help me be more private, so I might as well just stop trying.'

[–] hansolo@lemmy.today 11 points 3 days ago (16 children)

My friend, I think the confusion stems from you thinking you have deep technical understanding on this, when everything you say demonstrates that you don't.

First off, you don't even know the terminology. A local LLM is one YOU run on YOUR machine.

Lumo apparently runs on Proton servers - where their email and docs all are as well. So I'm not sure what "Their AI is not local!" even means, other than that you don't know what LLMs do or what they actually are. Do you expect a 32B LLM that would need about a 32GB video card to all get downloaded and run in a browser? Buddy....just...no.

Look, Proton can at any time MITM attack your email, or, if you use them as a VPN, MITM your VPN traffic if it feels like it. Any VPN or secure email provider can actually do that. Mullvad can, Nord can, take your pick. That's just a fact. Google's business model is to MITM attack your life, so we already have the counterfactual. So your threat model needs to include how much you trust the entity handling your data not to do that, intentionally or by letting others through negligence.

There is no such thing as e2ee LLMs. That's not how any of this works. Doing e2ee for the chats to get what you type into the LLM context window, letting the LLM process tokens the only way they can, getting you back your response, and getting it to not keep logs or data, is about as good as it gets for not having a local LLM - which, remember, means on YOUR machine. If that's unacceptable for you, then don't use it. But don't brandish your ignorance like you're some expert, and that everyone on earth needs to adhere to whatever "standards" you think up that seem ill-informed.

Also, clearly you aren't using Proton anyway because if you need to search the text of your emails, you have to process that locally, and you have to click through 2 separate warnings that tell you in all bold text "This breaks the e2ee! Are you REALLY sure you want to do this?" So your complaint about warnings is just a flag saying you don't actually know and are just guessing.

[–] loudwhisper@infosec.pub 5 points 3 days ago

Scribe can be local, if that's what you are referring to.

They also have a specific section on it at https://proton.me/support/proton-scribe-writing-assistant#local-or-server

Also emails for the most part are not e2ee, they can't be because the other party is not using encryption. They use "zero-access" which is different. It means proton gets the email in clear text, encrypts it with your public PGP key, deletes the original, and sends it to you.

See https://proton.me/support/proton-mail-encryption-explained

The email is encrypted in transit using TLS. It is then unencrypted and re-encrypted (by us) for storage on our servers using zero-access encryption. Once zero-access encryption has been applied, no-one except you can access emails stored on our servers (including us). It is not end-to-end encrypted, however, and might be accessible to the sender’s email service.
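That zero-access flow (the server encrypts incoming mail with your public key, then discards the plaintext) can be sketched with a toy public-key scheme. This is textbook RSA with tiny primes, purely illustrative and nothing like Proton's actual OpenPGP implementation, which uses proper padding and hybrid encryption:

```python
# Toy sketch of "zero-access" storage: the server holds only the
# user's PUBLIC key, so it can encrypt incoming mail at rest but
# cannot read it back afterwards.

# Tiny textbook RSA keypair (p=61, q=53 -> n=3233, e=17, d=2753).
n, e, d = 3233, 17, 2753

def encrypt(byte, pub_n=n, pub_e=e):
    # Anyone (including the server) can do this with the public key.
    return pow(byte, pub_e, pub_n)

def decrypt(c, priv_n=n, priv_d=d):
    # Only the holder of the private exponent d can do this.
    return pow(c, priv_d, priv_n)

# 1. Mail arrives over TLS; the server briefly sees plaintext.
plaintext = b"hi"
# 2. Server encrypts byte-by-byte with the user's public key...
stored = [encrypt(b) for b in plaintext]
# 3. ...and discards the original. Note this "discard" step is a
#    promise, not something the protocol can enforce.
del plaintext
# 4. Later, only the user can decrypt what was stored.
recovered = bytes(decrypt(c) for c in stored)
```

The crux of the thread's argument lives in step 3: the scheme is only as honest as the deletion.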

[–] ztwhixsemhwldvka@lemmy.world 10 points 3 days ago (2 children)
[–] DreamlandLividity@lemmy.world 5 points 3 days ago (1 children)

Yes, indeed. Even so, just because there is a workaround, we should not ignore the issue (governments descending into fascism).

[–] brucethemoose@lemmy.world 51 points 2 days ago* (last edited 2 days ago) (1 children)

First of all...

Why does an email service need a chatbot, even for business? Is it an enhanced search over your emails or something? Like, what does it do that any old chatbot wouldn't?

EDIT: Apparently nothing. It's just a generic Open Web UI frontend with Proton branding, a no-logs (but not E2E) promise, and kinda old 12B-32B class models, possibly finetuned on Proton documentation (or maybe just a branded system prompt). But they don't use any kind of RAG as far as I can tell.

There are about a bajillion of these, and one could host the same thing inside docker in like 10 minutes.

...On the other hand, it has no access to email I think?

[–] WhyJiffie@sh.itjust.works 9 points 2 days ago (4 children)

Why does an email service need a chatbot, even for business?

they haven't been only an email service for quite some time now

There are about a bajillion of these, and one could host the same thing inside docker in like 10 minutes.

sure, with a thousand or two dollars' worth of equipment and the computer knowledge. Anyone could do it, really. But even if not, why don't they just rawdog Deepseek? I don't get it either

...On the other hand, it has no access to email I think?

that's right. You can upload files, though, or select some from your Proton Drive, and it can do web search.

[–] brucethemoose@lemmy.world 8 points 2 days ago* (last edited 2 days ago)

sure, with a thousand or two dollars worth of equipment and then computer knowledge. Anyone could do it really. but even if not, why don’t they just rawdog deepseek? I don’t get it either

What I mean is there are about 1000 different places to get 32B class models via Open Web UI with privacy guarantees.

With mail, vpn, (and some of their other services?) they have a great software stack and cross integration to differentiate them, but this is literally a carbon copy of any Open Web UI service… There is nothing different other than the color scheme and system prompt.

I’m not trying to sound condescending, but it really feels like a cloned “me too,” with the only value being the Proton brand and customer trust.
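For a sense of how thin such a wrapper is: these frontends all speak the same OpenAI-style chat API, and the per-turn request looks roughly like this. The endpoint URL and model name below are placeholders I made up for a hypothetical self-hosted server, not Proton's actual backend:

```python
import json
import urllib.request

# Hypothetical local endpoint (e.g. a self-hosted vllm or Ollama server).
ENDPOINT = "http://localhost:8000/v1/chat/completions"

def build_chat_request(history, user_msg, model="mistral-small"):
    """Build one chat turn. Note: the FULL chat history rides along
    with every request, which is why the serving side necessarily
    sees plaintext on each turn."""
    payload = {
        "model": model,
        "messages": history + [{"role": "user", "content": user_msg}],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    history=[{"role": "system", "content": "You are a helpful assistant."}],
    user_msg="Summarise my notes.",
)
```

Swap the colour scheme, the system prompt, and the endpoint, and you have essentially the whole product surface of one of these branded wrappers.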

[–] A_norny_mousse@feddit.org 24 points 3 days ago (5 children)

For a critical blog, the first few paragraphs sound a lot like they're shilling for Proton.

I'm not sure if I'm supposed to be impressed by the author's witty wording, but "the cool trick they do" is - full encryption.

Moving on.

But that’s misleading. The actual large language model is not open. The code for Proton’s bit of Lumo is not open source. The only open source bit that Proton’s made available is just some of Proton’s controls for the LLM. [GitHub]

In the single most damning thing I can say about Proton in 2025, the Proton GitHub repository has a “cursorrules” file. They’re vibe-coding their public systems. Much secure!

oof.

Over the years I've heard many people claim that Proton's servers being in Switzerland makes it more secure than other EU countries - well, there's also this now:

Proton is moving its servers out of Switzerland to another country in the EU they haven’t specified. The Lumo announcement is the first that Proton’s mentioned this.

No company is safe from enshittification - always look for, and base your choices on, the legally binding stuff, before you commit. Be wary of weasel wording. And always, always be ready to move* on when the enshittification starts despite your caution.


* regarding email, there are redirection services a.k.a. eternal email addresses - in some cases run by venerable non-profits.

[–] Tetsuo@jlai.lu 33 points 3 days ago (1 children)

Regarding the fact that Proton is stopping hosting in Switzerland: I thought it was because of new laws in Switzerland, and that they had not much of a choice?

[–] DeathByBigSad@sh.itjust.works 8 points 2 days ago

The law isn't a law yet, it's just a proposal. Proton is still in Switzerland, but they said they're gonna move if the surveillance law actually becomes law.

[–] loudwhisper@infosec.pub 16 points 3 days ago

Over the years I've heard many people claim that proton's servers being in Switzerland is more secure than other EU countries

Things change. They are doing it because Switzerland is proposing legislation that would definitely make that claim untrue. Europe is no paradise, especially certain countries, but it still makes sense.

From the lumo announcement:

Lumo represents one of many investments Proton will be making before the end of the decade to ensure that Europe stays strong, independent, and technologically sovereign. Because of legal uncertainty around Swiss government proposals to introduce mass surveillance — proposals that have been outlawed in the EU — Proton is moving most of its physical infrastructure out of Switzerland. Lumo will be the first product to move.

This shift represents an investment of over €100 million into the EU proper. While we do not give up the fight for privacy in Switzerland (and will continue to fight proposals that we believe will be extremely damaging to the Swiss economy), Proton is also embracing Europe and helping to develop a sovereign EuroStack for the future of our home continent. Lumo is European, and proudly so, and here to serve everybody who cares about privacy and security worldwide.

[–] ItsComplicated@sh.itjust.works 14 points 3 days ago

Switzerland has a surveillance law in the works that will force VPNs, messaging apps, and online platforms to log users' identities, IP addresses, and metadata for government access

[–] brucethemoose@lemmy.world 18 points 2 days ago* (last edited 2 days ago)

OK, so I just checked the page:

https://lumo.proton.me/guest

Looks like a generic Open Web UI instance, much like Qwen's: https://openwebui.com/

Based on this support page, they are using open models and possibly finetuning them:

https://proton.me/support/lumo-privacy

The models we’re using currently are Nemo, OpenHands 32B, OLMO 2 32B, and Mistral Small 3

But this information is hard to find, and they aren't particularly smart models, even for 32B-class ones.

Still... the author is incorrect, they specify how long requests are kept:

When you chat with Lumo, your questions are sent to our servers using TLS encryption. After Lumo processes your query and generates a response, the data is erased. The only record of the conversation is on your device if you’re using a Free or Plus plan. If you’re using Lumo as a Guest, your conversation is erased at the end of each session. Our no-logs policy ensures we keep no logs of what you ask, or what Lumo replies. Your chats can’t be seen, shared, or used to profile you.

But it also mentions that, as is a necessity now, they are decrypted on the GPU servers for processing. Theoretically they could hack the input/output layers and the tokenizer into a pseudo E2E encryption scheme, but I haven't heard of anyone doing this yet... And it would probably be incompatible with their serving framework (likely vllm) without some crack CUDA and Rust engineers (as you'd need to scramble the text and tokenize/detokenize it uniquely for scrambled LLM outer layers for each request).

They are right about one thing: Proton all but advertises Lumo as E2E, and that is a lie. Per its usual protocol, Open Web UI will send the chat history for that particular chat to the server with each request, where it is decoded and tokenized. If the GPU server were to be hacked, it could absolutely be logged and intercepted.
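The pseudo-E2E "scrambled tokenizer" idea mentioned above boils down to the client and server agreeing on a secret permutation of the token ids, so a compromised GPU host only ever logs gibberish. A toy sketch with a made-up vocabulary size; note that a real model's embedding and output layers would have to be permuted to match, which, as the comment says, no mainstream serving stack supports:

```python
import random

# Toy "scrambled tokenizer": the server only ever sees PERMUTED
# token ids; the client holds the secret permutation.

VOCAB = 1000  # made-up vocabulary size

def make_permutation(secret_seed):
    """Derive a permutation and its inverse from a shared secret."""
    rng = random.Random(secret_seed)
    perm = list(range(VOCAB))
    rng.shuffle(perm)
    inverse = [0] * VOCAB
    for plain, scrambled in enumerate(perm):
        inverse[scrambled] = plain
    return perm, inverse

perm, inv = make_permutation(secret_seed=42)

tokens = [17, 256, 999]                  # "plaintext" token ids
scrambled = [perm[t] for t in tokens]    # what the server would see
recovered = [inv[t] for t in scrambled]  # client-side unscramble
```

The hard part isn't this mapping, of course; it's making the model's outer layers operate natively on the permuted ids so the plaintext ids never exist server-side.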

[–] Gaja0@lemmy.zip 16 points 3 days ago (2 children)

I'm just saying Andy sucking up to Trump is a red flag. I'm cancelling in 2026 🫠

[–] badelf@lemmy.dbzer0.com 15 points 2 days ago (3 children)

Proton has my vote for fastest company ever to completely enshittify.

[–] EncryptKeeper@lemmy.world 18 points 2 days ago

How have they enshittified? I haven’t noticed anything about their service get worse since they started.

[–] cley_faye@lemmy.world 14 points 2 days ago (3 children)

Any business putting out a "privacy first" thing that works only on their servers, and requires full access to plaintext data to operate, should be seen as lying.

I've been annoyed by proton for a long while; they do (did?) provide a seemingly adequate service, but claims like "your mails are safe" when they obviously had to have them in plaintext on their server, even if only for compatibility with current standards, kept me away from them.

[–] EncryptKeeper@lemmy.world 11 points 2 days ago (6 children)

they obviously had to have them in plaintext on their server, even if only for compatibility with current standards

I don’t think that’s obvious at all. On the contrary, that’s a pretty bold claim to make, do you have any evidence that they’re doing this?

[–] DeathByBigSad@sh.itjust.works 6 points 2 days ago* (last edited 2 days ago) (2 children)

Incoming emails that aren't from Proton or PGP-encrypted (which is like 99% of email) arrive at Proton's servers via TLS, which they decrypt, and then they have the full plaintext. This is not some conspiracy, this is just how email works.

Now, Proton and various other "encrypted email" services then take that plaintext and encrypt it with your public key, store the ciphertext on their servers, and then they're supposed to discard the plaintext, so that in case of a future court order they wouldn't have the plaintext anymore.

But you can't be certain they aren't lying, since they necessarily have access to the plaintext for email to function. So "we can't read your emails" comes with a huge asterisk: it only applies to emails sent between Proton accounts or other PGP-encrypted emails. Your average bank statement and tax forms are all accessible by Proton (you're only relying on their promise not to read them).

[–] EncryptKeeper@lemmy.world 17 points 2 days ago* (last edited 2 days ago) (10 children)

Ok, yeah, that's a far cry from Proton actually "having your unencrypted emails on their servers" as if they're not encrypted at rest.

There's the standard layer of trust you need to have in a third party when you're not self-hosting. Proton has so far proven that they do in fact encrypt your emails and haven't given any up to authorities when ordered to, so I'm not sure where the issue is. I thought they were caught not encrypting them or something.

[–] Harry_h0udini@lemmy.dbzer0.com 13 points 1 day ago (1 children)

Proton is shifting into a mainstream company. AI crap, false and misleading advertising.

[–] Red_October@lemmy.world 9 points 2 days ago (2 children)

Okay but are any AI chatbots really open source? Isn't half the headache with LLMs the fact that there comes a point where it's basically impossible for even the authors to decode the tangled madness of their machine learning?

[–] lefixxx@lemmy.world 9 points 1 day ago* (last edited 1 day ago) (1 children)

Yeah, but you don't open-source the LLM as such; you open-source the training code, the weights, and the specs/architecture

[–] nymnympseudonym@lemmy.world 8 points 1 day ago* (last edited 1 day ago)

Yes, several are fully open source. I like Mistral
