I honestly don't get why anyone would have bought an Intel in the last 3-4 years. AMD was just better on literally every metric.
If your use case benefited from Quick Sync, then Intel was a clear choice.
Older Intel CPUs are the only ones that can play 4K Blu-rays directly in a player rather than just ripping them to a drive. Very niche use case, but it's one I can think of.
They can't even do that anymore. SGX had a bunch of vulnerabilities, and as a result that feature has been disabled.
https://sgx.fail/ SGX.Fail
Idle power is the only thing they are good at, but for a home server a used older CPU is good enough.
Was that even true for comparable CPUs? I feel this was only for their N100s etc.
Nah, all the AM4 CPUs have abysmal idle power. AM5 got a little better as far as I know, but the Infinity Fabric was a nightmare for idle power.
Well I concede, I guess there was one metric they were better at. Doing absolutely nothing.
Looks like they didn't have adequate cooling for their CPU and killed it... then replaced it without correcting the cooling. If your CPU hits three digits, it's not cooled properly.
If your CPU hits 3 digits, then throttling isn't working properly, because it should kick in before it hits that point.
The article (or one of the linked ones) says the max design temperature is 105°C, so it doesn't throttle until it hits that.
Which makes me think it should be able to sustain operating at that temperature. If not, Intel fucked up by speccing them too high.
I'd expect it to still throttle before getting to 105C, and then adjust to maintain a temp under 105C. If it goes above 105C, it should halt.
Then you misunderstand the spec. That's the max operating temperature, not the thermal protection limit. It throttles at 105 so it doesn't hit the limit at 115 or whatever and shut down. I can't find a detailed spec sheet that might give an exact figure.
The chip needs to account for thermal runaway, so I'd expect it to throttle before reaching the max operating temperature and then adjust so it stays within that range. So it should downclock a little around 90C or whatever, then downclock harder as needed as it approaches 105C or whatever the max operating temp is. If it goes above that temp, it should aggressively throttle or halt, depending on how far above it went and how quickly.
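Roughly the policy I'm describing, as a toy Python sketch; the thresholds, step sizes, and the 115C trip point are invented for illustration (real chips do all of this in hardware/firmware):

```python
# Toy model of the throttling policy described above. Temperatures and
# step sizes are made up for illustration; real CPUs do this in firmware.
T_SOFT = 90.0    # where to start easing clocks (hypothetical)
T_MAX = 105.0    # the max operating temperature discussed above
T_HALT = 115.0   # assumed thermal-trip point, not from any datasheet

def next_multiplier(temp_c: float, mult: float) -> float:
    if temp_c >= T_HALT:
        return 0.0                          # thermal trip: halt outright
    if temp_c >= T_MAX:
        return max(mult - 0.10, 0.1)        # aggressive downclock
    if temp_c >= T_SOFT:
        # back off a little near T_SOFT, harder as temp approaches T_MAX
        step = 0.01 + 0.04 * (temp_c - T_SOFT) / (T_MAX - T_SOFT)
        return max(mult - step, 0.1)
    return min(mult + 0.02, 1.0)            # headroom left: clock back up

print(next_multiplier(92.0, 1.0))   # ~0.985, mild backoff
print(next_multiplier(106.0, 1.0))  # 0.90, hard backoff
print(next_multiplier(70.0, 0.80))  # 0.82, recovering
```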
Why? It’s designed to run up to 105c.
I think it was when AMD's 7000-series CPUs were running at 95C and everyone freaked out, and AMD came out and said that the CPUs are built to handle that load 24/7, 365, for years on end.
And it’s not like this is new to Intel. Intel laptop CPUs have been doing this for a decade now.
CPUs should throttle as they approach the limit to prevent thermal runaway. As a chip gets closer to that limit, it should adjust the frequency in smaller increments so the temperature changes stay small.
105c is the max operating temperature. It's not going to run away the second it hits 106.
Your CPU starts throttling at 104C so that it almost never sits at 105C for long. If it can't maintain clocks, it drops them until 104C can mostly be maintained.
If you have an improperly mounted cooler, you could very well get to 105C incredibly quickly, and 115C or whatever the halt temp is shortly after.
My Intel Mac's CPU (i5-5250U) throttles to maintain 105C.
That's not the case. That's 100% true for new CPUs, but it holds for old ones too.
My father's old CPU cooler did not make good contact (it got loose in one corner somehow), and the system would throttle (fan at 100% making noise and the PC running slow). After I fixed it on one of my visits, the CPU worked fine for years.
The system throttles or even shuts down before any thermal damage occurs (at least when temperatures rise normally).
Pretty much anything with a heat spreader should be impossible to accidentally kill. Bare die? May dog have mercy on your soul.
What if it hits around 90°C during Vulkan shader processing? 😅 Otherwise it idles around 42–52°C. How's that? I'm wondering if my cooling is sufficient.
This is an AMD 9950X3D + 9070 XT setup, for reference.
Any way to do Vulkan shader processing on the GPU perhaps, to speed it up?
It's fine. Modern CPUs boost until they hit amperage, voltage, or thermal constraints; assuming the motherboard isn't misbehaving, the upper limits for all of those are safe to sit at perpetually.
If you're talking about the Steam feature, you can safely turn it off; any modern hardware running Mesa RADV (the default AMD Vulkan driver in most distros) should be able to process shaders in real time thanks to ACO.
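If you want to actually watch the temps while the shaders churn, a quick psutil loop on Linux does the job; sensor and label names vary per board (k10temp is just what AMD CPUs typically show up as), so treat those as assumptions:

```python
import time
import psutil  # pip install psutil; sensors_temperatures() is Linux-only

# Print CPU temperatures every couple of seconds while the load runs.
# Sensor/label names vary per board; "k10temp" is typical for AMD CPUs.
for _ in range(30):
    for chip, entries in psutil.sensors_temperatures().items():
        for e in entries:
            print(f"{chip}/{e.label or 'temp'}: {e.current:.1f}°C "
                  f"(high={e.high}, critical={e.critical})")
    time.sleep(2)
```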
I built a new PC recently. All I needed to see were the benchmarks over the last 5 years. There's currently no contest.
I went from a Ryzen 1000-series to an Intel 12th gen since I need single-threaded performance above all else (CAD). Plus it was a steal of a deal.
If Intel ever sorts out their drivers, or it gets cheap enough, I might go for a 14th-gen chip, but no further.
"Do you need to transcode video?
Then leave Intel the fuck alone."
Been my rule for 20 years, and it's worked well so far.
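For what it's worth, the transcode case boils down to one ffmpeg call with the QSV decoder/encoder. A minimal sketch, assuming an ffmpeg build with Quick Sync support; the file names are placeholders:

```python
import subprocess

# Placeholder file names; assumes an ffmpeg build with Quick Sync (QSV)
# support and an Intel iGPU visible to the process.
subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv",        # hardware-accelerated decode on the iGPU
    "-i", "input.mkv",
    "-c:v", "h264_qsv",       # encode with the Quick Sync H.264 encoder
    "-global_quality", "23",  # quality target, roughly CRF-like
    "-c:a", "copy",           # pass audio through untouched
    "output.mkv",
], check=True)
```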
It’s odd: their GPUs are doing fine, a market they’re young in, but their well-established CPU business is cratering.
Business majors suck.
Their GPU situation is weird. The gaming GPUs are good value, but I can't imagine Intel makes much money from them due to the relatively low volume yet relatively large die size compared to competitors (the B580 has a die nearly the size of a 4070 despite competing with the 4060). Plus they don't have a major foothold in the professional or compute markets.
I do hope they keep pushing in this area still, since some serious competition for NVIDIA would be great.
Sure, if by doing fine you mean looking alright in benchmarks while having zero supply because they don't make money selling them and thus don't want to produce them in any significant amount.
They always did, even back in college.
Yeah, Quick Sync is the only reason I put an Intel in my NAS.
I'd probably just warranty the CPU and assume it was a defect instead of blame the entire company.
But yeah, AMD is the better choice for everything atm except power-efficient x86 laptop chips.
It was OK until he said the AMD chip consumed more power. It's an X3D chip, so that's pretty much a given; if he'd gone for a non-X3D chip he'd have saved quite a bit of power, especially at idle. Plus he seems to use an AMD chip like an Intel chip, with little or no idea how to tweak its power usage down.
Interesting, so it's not only their recent-ish (either 12th or 13th gen and up, iirc) laptop CPUs that die under normal load.
I knew Michael Stapelberg from other projects, but I just realized he is the author of the i3 Window Manager. Damn!
Just out of interest: why did you buy Intel in the first place? I don't know of many use cases where Intel is the superior option.
I'd never heard of Arrow Lake dying like Raptor Lake has been. Wild.
The computer I bought should last me about 10 years. I spent a fuckload of money on it. The next one will have to be built with as little Google and privacy-violating shit as possible.
And I am certain AMD will make better stuff by then.
Somehow I figured out Intel was shit early on. Been AMD for like 15-20 years. I think it was a combo of childhood shit computers running Intel, and a lot of advice pointing out what garbage it was and not worth the cost for PC builds.
Similar reasons I hate Hitachi and Western Digital hard drives. They always fucking fail.
15-20 years is silly. Intel was the clear leader for a long time before Ryzen in 2017, and arguably a few years after that too.
I was in team AMD in the 2000s for two reasons: price and competition to Intel. Intel had a massive anti-trust loss to AMD around that time, and I wanted AMD to succeed. I stuck with them until Zen was actually competitive and stayed with them ever since because they actually had better products. Intel was the king in both performance and power efficiency until that Zen release, so I really don't know where that advice would've come from.
As for Hitachi and Western Digital, WTF? Hitachi hasn't been a thing for well over a decade since they sold their HDD business to WD, and WD is generally as reliable or better than its competition. It sounds like you were impacted by a couple failures (probably older drives?) and made a decision based on that. If you look at Backblaze stats, there's not a huge difference between manufacturers, just a few models that do way worse than the rest.
"Similar reasons I hate Hitachi and Western Digital hard drives. They always fucking fail."
You misspelled Seagate.
My WD drives have been great, but my Seagates failed multiple times, causing data loss because I wasn't properly protecting myself.
All manufacturers have bad batches. Use diversity and keep backups.
Seagate has more than bad batches. When every single one of their 1TB-per-platter Barracuda drives has high failure rates, that's a design/long-term production issue.
How likely is it that I got 4 to 5 bad batches over the space of as many years?
RAID and offline backups these days; I eventually learned my lessons. One of which is: stay away from Seagate.
Within the realm of possibility. Especially if you treat them harshly (lots of start-stop, and low airflow and high temps). Backblaze collects and publishes data, and the AFR for Seagate is slightly higher than other manufacturers, but not what I'd consider dangerous.
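For a sense of scale, Backblaze's AFR is just failures per drive-year. The numbers below are made up purely to show the arithmetic, not real stats:

```python
# Annualized failure rate, Backblaze-style: failures per drive-year.
# Inputs are invented purely to illustrate the calculation.
def afr(failures: int, drive_days: int) -> float:
    drive_years = drive_days / 365.0
    return failures / drive_years * 100.0

print(f"{afr(30, 1_000_000):.2f}% AFR")  # 30 failures over ~2740 drive-years ≈ 1.10%
```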