I suppose if some sort of critical mass is reached, it could push the world from x86-64 to ARM? Every modern OS supports it at this point, and emulators have come a long way for older software that needs them.
You mean as standalone parts for purchase?
Because Apple does have M-series chips in desktop configurations already: the Mac Studio and iMac.
Yeah. It would be interesting, but Apple obviously wouldn't ever do that, unfortunately.
Yeah, from Apple's perspective it cheapens their brand and makes them just another commodity parts supplier. Plus, used outside of their ecosystem, the chips might not shine the way they do on their managed platform, which could tarnish the image too. And it kills some of the competitive edge they have on hardware if anyone could slap something together with the same chips.
I suspect keeping their chips and tech to themselves is vital to their strategy.
Yup. I desperately wanted Apple to support the PowerPC clones back in the 90s, and there are still times I wish that they were more open, but they have their place.
The market runs Windows, so it would entirely depend on how well Windows runs on them. If you're buying an Apple chip to run macOS, you're already getting the best deal out of Apple anyway.
Given the history of Exynos, I doubt Samsung will ever make anything high performance. If you want high-performance ARM, you'll probably want to go for something like Ampere, like the workstation that System76 is selling right now.
The modern Snapdragons seem more than fast enough for most desktop use. They have PCIe capabilities, so in theory you could just hook up a GPU and use them in a gaming rig. The most power-efficient gaming rig could hilariously be a Qualcomm CPU paired with an Intel GPU. Qualcomm's media encoder/decoder is also leagues ahead of the desktop competition, so streamers may get an edge there if OBS can take advantage of the hardware acceleration. Unfortunately, from what I've seen in reviews, some games don't like to run on ARM. Performance is just fine (very impressive for laptop GPUs!), but without stability, you're not attracting many gamers.
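On the encoder point, here's a rough way to check what hardware encoders a given machine actually exposes. This is just a sketch, assuming an ffmpeg build is on PATH; the name hints are my assumption (Media Foundation's `_mf` wrappers are what Windows-on-ARM Snapdragon machines typically expose), not anything OBS-specific.

```python
import subprocess

# Probe the local ffmpeg build for hardware-backed video encoders.
# Which wrappers show up depends entirely on how ffmpeg was built
# and what the platform exposes (_mf = Media Foundation on Windows,
# _nvenc = NVIDIA, _qsv = Intel, _amf = AMD, etc.).
HW_HINTS = ("_mf", "_qsv", "_nvenc", "_amf", "_vaapi", "videotoolbox")

def list_hw_encoders() -> list[str]:
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    encoders = []
    for line in out.splitlines():
        parts = line.split()
        # Encoder rows look like " V....D h264_mf  H264 via MediaFoundation";
        # the capability string for video encoders starts with "V".
        if len(parts) >= 2 and parts[0].startswith("V"):
            if any(hint in parts[1] for hint in HW_HINTS):
                encoders.append(parts[1])
    return encoders

if __name__ == "__main__":
    print(list_hw_encoders())
```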
If Qualcomm targets the desktop market, I expect them to go all in on Mac Mini-style computers. Their Snapdragon chips inside those ultra-thin desktops Lenovo sells pack a surprising punch, and they're more than good enough for most desktop use. Taking the fight to gaming seems like picking an uphill battle for no reason.
Unfortunately, modern ARM designs all seem to go the same route as Apple, with unified memory for both CPU and GPU. You can run the CPU on swappable DIMMs, but the GPU needs more bandwidth than that, so you'll need to get soldered RAM. I was hoping LPCAMM2 would fix that, but Framework and AMD tried and couldn't get their new AMD chip to work without soldering the memory for stable performance, so I'm thinking the days of swappable memory are coming to an end.
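To put rough numbers on why the GPU side pushes toward soldered memory: peak bandwidth is roughly transfer rate times bus width. The configurations below are illustrative, not measurements of any particular machine.

```python
# Back-of-envelope peak memory bandwidth: MT/s x bus width in bytes.
# Figures are theoretical maximums for illustration only.

def bandwidth_gbps(mts: int, bus_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return mts * (bus_bits // 8) / 1000

configs = {
    "dual-channel DDR5-5600 DIMMs (128-bit)": bandwidth_gbps(5600, 128),
    "soldered LPDDR5X-8533, 128-bit bus":     bandwidth_gbps(8533, 128),
    "soldered LPDDR5X-8533, 256-bit bus":     bandwidth_gbps(8533, 256),
}

for name, bw in configs.items():
    print(f"{name}: ~{bw:.0f} GB/s")
# Socketed DIMMs top out around ~90 GB/s here, while a wide soldered
# bus can double or triple that, which is the gap an integrated GPU cares about.
```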
Dell is already releasing Qualcomm SoC Latitudes. There are bound to be compatibility issues, but performance-wise it's kinda undeniable that this is where the market is going. It is far more energy efficient than an Intel or AMD x86 CPU and holds up just fine. The main downsides you'll see could likely be resolved with ASICs, which is how Intel keeps 4k video from being choppy on low-end APUs, for example. Compared to the M4, Qualcomm's offering is slightly better at multithreaded performance and slightly worse at single thread. The real downside is the reliance on raw throughput for tasks that both brands of CPUs have purpose-built daughter chips for.
it's kinda undeniable that this is where the market is going. It is far more energy efficient than an Intel or AMD x86 CPU and holds up just fine.
Is that actually true, when comparing node for node?
In the mobile and tablet space, Apple's A-series chips have always been a generation ahead of Qualcomm's Snapdragon chips in terms of performance per watt, while Samsung's Exynos has lagged even further behind. That's obviously not an instruction set issue, since all three lines are on ARM.
Much of Apple's advantage has been a willingness to pay for early runs on each new TSMC node, and a willingness to dedicate a lot of square millimeters of silicon to their gigantic chips.
But when comparing node for node, last I checked, AMD's lower-power chips designed for laptop TDPs have similar performance and power draw to the Apple chips on that same TSMC node.
Do you have a source for AMD chips being especially energy efficient? I don't consider them to be even close. The M3 gets 190 Cinebench points per watt whereas the Ryzen 7 7840U gets 100. My points-per-watt data doesn't contain Snapdragon X yet, but it's generally considered the multithreading king on the market, and it runs at a significantly lower TDP than AMD. SoCs are inherently more energy efficient. My memory of why is that x86's instruction set allows for more complicated operations, while ARM is restricted to composing less complicated operations as building blocks when complexity is required.
Like I mentioned though, there are tasks where x86 can't be beaten, but that's because they use on-chip ASICs for hardware-accelerated encoding/decoding, and nothing is more efficient at a task than a purpose-built, task-specific ASIC/FPGA.
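For clarity on the points-per-watt figure above: it's just the benchmark score divided by average package power during the run. The numbers in this sketch are made up, chosen only to reproduce the 190-vs-100 ratio cited.

```python
# Points-per-watt: benchmark score / average package power during the run.
# Scores and wattages below are placeholders to show the arithmetic,
# not real measurements of any chip.

def points_per_watt(score: float, avg_watts: float) -> float:
    return score / avg_watts

# A chip scoring 1900 at 10 W averages 190 points/W; one scoring 2500
# at 25 W averages only 100 points/W despite the higher absolute score.
print(points_per_watt(1900, 10))   # 190.0
print(points_per_watt(2500, 25))   # 100.0
```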
Do you have a source for AMD chips being especially energy efficient?
I remember reviews of the HX 370 commenting on that. Problem is, that chip was produced on TSMC's N4P node, which doesn't have an Apple comparator (the M2 was on N5P and the M3 was on N3B). The Ryzen 7 7840U was on N4, one year behind that. It just shows that AMD can't even get on a TSMC node within a year or two of Apple.
Still, I haven't seen anything really putting these chips through their paces and actually measuring real-world energy usage while running a variety of benchmarks. And benchmarks themselves only correlate with specific ways computers are used and aren't necessarily supported on all hardware or OSes, so it's hard to get a real comparison.
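If someone wanted to do that kind of real-world measurement themselves, on Linux x86 the powercap/RAPL counters are the usual starting point. A minimal sketch, assuming the common sysfs path (it varies by machine, may need root to read, and ARM SoCs generally need vendor-specific counters or an external power meter instead):

```python
import time
from pathlib import Path

# Package energy counter exposed by the Linux powercap/RAPL interface,
# in microjoules. Path is the common default but is machine-dependent.
RAPL = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def measure_joules(workload) -> float:
    """Run a workload and report package energy used (ignores counter wraparound)."""
    before = int(RAPL.read_text())
    t0 = time.monotonic()
    workload()
    elapsed = time.monotonic() - t0
    after = int(RAPL.read_text())
    joules = (after - before) / 1e6
    print(f"{joules:.1f} J over {elapsed:.1f} s (avg {joules / elapsed:.1f} W)")
    return joules

if __name__ == "__main__":
    # Toy CPU-bound workload standing in for a real benchmark.
    measure_joules(lambda: sum(i * i for i in range(10_000_000)))
```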
SoCs are inherently more energy efficient
I agree, but that's a separate issue from the instruction set. The AMD HX 370 is an SoC (well, technically a SiP, as the pieces are all packaged together rather than printed on the same piece of silicon).
And in terms of actual chip architectures, as you allude to, the design dictates how specific instructions are processed. That's why the RISC-versus-CISC distinction is basically obsolete. Chip designers are making engineering choices about how much silicon area to devote to specific functions, based on their modeling of how the chip might be used (multithreading, different cores optimized for efficiency or performance, speculative execution, specialized hardware acceleration for video, cryptography, AI, and so on), and then deciding how that fits into the broader chip design.
Ultimately, I'd think the main reason something like x86 would die off is licensing, not anything inherent to the instruction set architecture.
I wonder what a motherboard designed by Apple would look like…
Also, given Apple, they would probably try to make everything proprietary (non-standard motherboard shapes, non-standard connectors, etc.).
Years ago, I built a hackintosh: a Gigabyte mobo with an Intel CPU. At that point, I learned you couldn't legally buy Apple's software (the OS) independently of their hardware.
Apple has no interest in people screwing with their stuff, and I've never looked at them since. I'm an Android/Linux guy. Apple can kick rocks.
How is it different from their laptops? The new ones have the M processor.