this post was submitted on 29 Sep 2025
164 points (85.0% liked)

Technology


...In Geekbench 6.5 single-core, the X2 Elite Extreme posts a score of 4,080, edging out Apple’s M4 (3,872) and leaving AMD’s Ryzen AI 9 HX 370 (2,881) and Intel’s Core Ultra 9 288V (2,919) far behind...

...The multi-core story is even more dramatic. With a Geekbench 6.5 multi-core score of 23,491, the X2 Elite Extreme nearly doubles the Intel Core Ultra 9 185H (11,386) and comfortably outpaces Apple’s M4 (15,146) and AMD’s Ryzen AI 9 370 (15,443)...
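For scale, the quoted scores work out to the following ratios. This is just a quick sanity check in Python using only the numbers cited above; it's not part of the article's own methodology:

```python
# Geekbench 6.5 scores exactly as quoted in the article
single_core = {
    "X2 Elite Extreme": 4080,
    "Apple M4": 3872,
    "Ryzen AI 9 HX 370": 2881,
    "Core Ultra 9 288V": 2919,
}
multi_core = {
    "X2 Elite Extreme": 23491,
    "Core Ultra 9 185H": 11386,
    "Apple M4": 15146,
    "Ryzen AI 9 370": 15443,
}

def lead_over_rivals(scores, flagship="X2 Elite Extreme"):
    """Return flagship-to-rival score ratios, rounded to 2 decimals."""
    top = scores[flagship]
    return {name: round(top / s, 2) for name, s in scores.items() if name != flagship}

print(lead_over_rivals(single_core))
# {'Apple M4': 1.05, 'Ryzen AI 9 HX 370': 1.42, 'Core Ultra 9 288V': 1.4}
print(lead_over_rivals(multi_core))
# {'Core Ultra 9 185H': 2.06, 'Apple M4': 1.55, 'Ryzen AI 9 370': 1.52}
```

In other words, the claimed single-core lead over the M4 is about 5%, while the multi-core figure is roughly 2x the Core Ultra 9 185H and about 1.5x the M4.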

...This isn’t just a speed play — Qualcomm is betting that its ARM-based design can deliver desktop-class performance at mobile-class power draw, enabling thin, fanless designs or ultra-light laptops with battery life measured in days, not hours.

One of the more intriguing aspects of the Snapdragon X2 Elite Extreme is its memory‑in‑package design, a departure from the off‑package RAM used in other X2 Elite variants. Qualcomm is using a System‑in‑Package (SiP) approach here, integrating the RAM directly alongside the CPU, GPU, and NPU on the same substrate.

This proximity slashes latency and boosts bandwidth — up to 228 GB/s compared to 152 GB/s on the off‑package models — while also enabling a unified memory architecture similar in concept to Apple’s M‑series chips, where CPU and GPU share the same pool for faster, more efficient data access...
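Using only the two bandwidth figures quoted above, the in-package configuration's advantage works out to a 50% uplift:

```python
# Bandwidth figures quoted for the two X2 Elite memory configurations
sip_bandwidth_gbs = 228      # memory-in-package (SiP) Extreme variant
off_pkg_bandwidth_gbs = 152  # off-package RAM variants

uplift = sip_bandwidth_gbs / off_pkg_bandwidth_gbs - 1
print(f"In-package memory: {uplift:.0%} more bandwidth")
# In-package memory: 50% more bandwidth
```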

... the company notes the "first half" of 2026 for the new Snapdragon X2 Elite and Snapdragon X2 Elite Extreme...

top 50 comments
[–] Alphane_Moon@lemmy.world 195 points 1 month ago* (last edited 1 month ago) (5 children)

Keep in mind the original X Elite benchmarks were never replicated in real world devices (not even close).

They used a desktop-style device (with intense cooling that isn't possible in laptops) and a "developed solely for benchmarking" version of Linux (to this day the X Elite runs like shit on Linux).

This is almost certainly a premeditated attempt at "legal false advertising".

Mark my words, you'll never see 4,000 points in GB6 ST on any real products.

[–] boonhet@sopuli.xyz 65 points 1 month ago* (last edited 1 month ago) (3 children)

They also used the base M4, not M4 Pro or Max

[–] CmdrShepard49@sh.itjust.works 35 points 1 month ago (1 children)

Seems like they're also using two different Intel chips in their testing for some reason.

[–] circuitfarmer@lemmy.sdf.org 24 points 1 month ago

I'll take cherrypicking for $500, Alex

[–] Ugurcan@lemmy.world 5 points 1 month ago (1 children)
[–] boonhet@sopuli.xyz 3 points 1 month ago

lol that’s just the cherry on the whole apple pie.

[–] Reverendender@sh.itjust.works 3 points 1 month ago

Now this all makes sense

[–] Zak@lemmy.world 21 points 1 month ago (1 children)

I imagine things would be much closer if they put a giant heatsink on that Ryzen 370 they're comparing against and ran it at its 54W configurable TDP instead of the default 28W.

[–] pycorax@sh.itjust.works 6 points 1 month ago

Shouldn't they also be comparing it to Strix Halo instead?

[–] tal@olio.cafe 12 points 1 month ago

Ah. Thanks for the context.

Well, after they have product out, third parties will benchmark them, and we'll see how they actually stack up.

[–] SharkAttak@kbin.melroy.org 6 points 1 month ago

I saw someone liquid cool an Arduino to push it to the max, but you couldn't declare it to be a regular benchmark...

[–] itztalal@lemmings.world 5 points 1 month ago

desktop-class performance at mobile-class power draw

This made my bullshit detector go haywire.

[–] Buffalox@lemmy.world 69 points 1 month ago (4 children)

Snapdragon X2 Elite Extreme

That doesn't sound very high end, I think I'll wait for the Pro version, preferably Pro Plus.

[–] Evil_Shrubbery@thelemmy.club 29 points 1 month ago (2 children)

BadDragon X2 Elite Extreme MAGNUM

[–] Valmond@lemmy.world 3 points 1 month ago (1 children)
[–] Evil_Shrubbery@thelemmy.club 2 points 1 month ago

The Raw Rare version ( ͡° ͜ʖ ͡°)

[–] ICastFist@programming.dev 2 points 1 month ago

That one will go hard

[–] PalmTreeIsBestTree@lemmy.world 17 points 1 month ago (1 children)

It sounds like an advertisement for a condom or dildo

[–] mannycalavera@feddit.uk 7 points 1 month ago

Don't you want to put on some of this thermal paste?

Where this is going, baby, you don't need no thermal paste!

faints on floor

[–] zaphod@sopuli.xyz 6 points 1 month ago

Elite Extreme

Sounds like it focuses more on shiny RGB than performance.

[–] prettybunnys@sh.itjust.works 2 points 1 month ago

The ultra absorbent one is the one to get

[–] a_fancy_kiwi@lemmy.world 65 points 1 month ago* (last edited 1 month ago) (1 children)

Let me know when these X elite chips have full Linux compatibility and then I’ll be interested. Until then, I’ll stick with Mac, it has the better hardware.

[–] just_another_person@lemmy.world 49 points 1 month ago* (last edited 1 month ago) (1 children)

I'm going to call semi-bullshit here, or there is a major revisionist version or catch. If this were true, they'd be STUPID not to be working fast as hell to get full, unlocked Linux support upstreamed and to start selling this as a datacenter competitor to what Amazon and Microsoft are offering, because it would be an entirely new class of performance. It could also dig into Nvidia's and AMD's datacenter sales at scale if it's this efficient.

[–] boonhet@sopuli.xyz 22 points 1 month ago* (last edited 1 month ago)

They put desktop cooling on the testbench apparently.

They’re also comparing to only the base M4 chip, not the Pro.

Also, the M5 could still come out this year. But it also might not, so it's a fair comparison till then.

Anyway if you’re looking for a Windows laptop specifically and don’t need anything that doesn’t run on ARM, it might be pretty damn good. I’d still wait for independent benchmarks.

[–] the_q@lemmy.zip 41 points 1 month ago (1 children)

Yeah I'll wait for independent benchmarks, thanks.

[–] Damage@feddit.it 17 points 1 month ago

With actual devices

[–] artyom@piefed.social 31 points 1 month ago

This will be super cool when we actually have OSs that can run on them!

[–] TheGrandNagus@lemmy.world 16 points 1 month ago

The X1 Elite never lived up to its geekbench scores, and the drivers are absolute dogshit.

The X2 Elite won't match Apple or AMD in real-world scenarios either, I'd wager.

[–] YurkshireLad@lemmy.ca 16 points 1 month ago

Windows 11 will turn this into a 486.

[–] malwieder@feddit.org 15 points 1 month ago

X2 "Elite Extreme" probably in ideal conditions vs. the base M4 chip in a real-world device. Sure, nice single-core results, but Apple will likely counter with the M5 (the A19 Pro already reaches around 4,000, and the M chips can probably clock a bit higher). And the M4 Pro and Max already score as high or higher in multi-core, in the real world, in a 14-inch laptop.

It doesn't "crush" the M4 series at all and we'll see how it'll perform in a comparable power/thermal envelope.

I don't hate what Qualcomm is doing here, but these chips only work properly under Windows and the Windows app ecosystem still hasn't embraced ARM all that much, and from what I've heard Windows' x64 to ARM translation layer is not as good as Rosetta 2. Linux support is pretty horrible, especially at launch.

[–] JigglySackles@lemmy.world 15 points 1 month ago

I am simple person. I see geekbench, I ignore claims and rest of article.

[–] KiwiTB@lemmy.world 10 points 1 month ago

I highly doubt this is accurate. Would be nice, but I doubt it.

[–] verdi@feddit.org 9 points 1 month ago

*X Elite opens browser windows faster under desktop cooling.

FTFY

[–] commander@lemmy.world 9 points 1 month ago (3 children)

How are the GPU drivers, though? Especially for Linux, in my case. These should be used in PC gaming handhelds, but Qualcomm support is mediocre.

If it's anything like their windows driver support then also awful. Maybe things have improved in the last year or so, but has Qualcomm ever put real effort into making ARM Windows laptops good?

[–] humanspiral@lemmy.ca 2 points 1 month ago (1 children)

Linux on ARM is not mature. On Windows, emulation of x86 is typically used. They'll also need to support all of the GPU libraries for gaming.

[–] vaionko@sopuli.xyz 3 points 1 month ago (1 children)

Desktop Linux on ARM*. The kernel itself has been running on embedded ARM devices for 25 years, and on a large portion of phones for 15.

[–] squaresinger@lemmy.world 2 points 1 month ago (4 children)

The question was about GPU drivers, and GPU drivers for ARM-based SoCs aren't even mature on Android. They are going to suck on Linux.

Compared to the drivers for Mali, Adreno, and friends, Nvidia is a bunch of saints, and we know how much Nvidia drivers suck under Linux.

[–] squaresinger@lemmy.world 2 points 1 month ago

How's the GPU drivers though? Especially to me for Linux.

Not. The answer is not.

[–] VeloRama@feddit.org 5 points 1 month ago

Can't wait for Linux to support it and Tuxedo creating a laptop with it.

[–] fittedsyllabi@lemmy.world 5 points 1 month ago

Then Apple releases M5.

[–] flemtone@lemmy.world 4 points 1 month ago

When the Snapdragon GPU performance is on par with AMD's 780m or above then we can talk.

[–] itztalal@lemmings.world 4 points 1 month ago

desktop-class performance at mobile-class power draw

checks source

windowcentral.com

Nothing to see here, folks.

[–] Valmond@lemmy.world 3 points 1 month ago (3 children)

And here I am with my cheap old quad core doing my stuff.

Except for the theoretical interest, what are we supposed to do with stuff like that? Is it just more data centers? Do I sound like I'm saying 640KB is enough?

[–] ABetterTomorrow@sh.itjust.works 2 points 1 month ago

Oh no, each new chip is going to be better at something than another chip, and vice versa. Anyways, what did people have for lunch?

[–] MuskyMelon@lemmy.world 2 points 1 month ago

In my experience, arm64 is nowhere close to x64 under heavy multiprocessing/multithreading loads.
