From videos I watched, the big issue is them losing their market position. They took a big hit when Apple ditched them and made their own chips. Now they're losing to AMD and Nvidia in the server space. Their newest desktop chips are under-performing. The consumer market is getting more competitive with Qualcomm joining the space and Nvidia/AMD preparing ARM chips. They built a lot of chip factories, but it sounds like they're struggling to lock in a major buyer. Now they're ejecting tens of thousands of employees over the next few months because they're hemorrhaging money.
TL;DR they're getting screwed on every front, and either it will take them a long time to recover or they're going to be left behind.
I hope they hang on and make something of a comeback, or carve out a niche market, but I don't feel sorry for them at all. They are guilty of shady monopolistic tactics.
As a nerdy consumer, I wouldn't count Intel out. I remember when their Pentium 4s ran hot and AMD started eating their lunch; then they launched the Core lineup and were back on top. They get lazy when they're not challenged.
That being said, historically they haven't done very well pivoting from their main business. Their GPU lineup seems kind of OK, but their attempt at mobile chips went nowhere.
Companies seem to have realized there's a real benefit to using ARM processors in laptops for performance and battery life, which is a direct threat to Intel's business.
So it's Intel's ability to deliver when pressure is applied vs. their inability to create products outside their comfort zone.
I don't count them out but it's a steep climb.
I've got my eye on their stock just in case this looks like it might turn into something like Apple in the 90s.
It's going to take a long while for them to come back, especially since they plan on laying off a huge number of their engineers.
From what I've heard, the main thing modern Intel really excels at is hardware video encoding.
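For anyone who wants to see what that looks like in practice: Quick Sync is the fixed-function encoder block in Intel's iGPUs (and Arc cards), and the usual way to reach it is through ffmpeg's QSV encoders. Here's a minimal sketch, assuming an ffmpeg build with QSV support and placeholder filenames:

```python
import subprocess

# Offload H.264 encoding to Intel Quick Sync via ffmpeg's h264_qsv encoder.
# Assumes ffmpeg was built with QSV support (check: ffmpeg -encoders | grep qsv)
# and an Intel iGPU or Arc GPU is present; the filenames are placeholders.
subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mp4",        # source clip (placeholder name)
        "-c:v", "h264_qsv",       # hardware H.264 encoder
        "-global_quality", "25",  # ICQ quality target; lower = higher quality
        "-c:a", "copy",           # pass the audio stream through untouched
        "output.mp4",
    ],
    check=True,  # raise if ffmpeg exits non-zero
)
```

The same pattern works with `hevc_qsv`, and on Arc cards `av1_qsv`, which is a big part of why Intel iGPUs are popular in home media servers.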
Sounds about right. I was a die-hard Intel fan for years, but when I upgraded my PC this year I picked AMD for the first time in my life. Looking at the scathing reviews and the performance tables, picking Intel would have been an insane choice.
The hit wasn't Apple leaving them; that was a small part of their business. The failure was not getting in on mobile when they had the chance. They could have diversified and they didn't, so when AMD came to eat their lunch, they had no fallback and no way to catch up.
Apple probably didn't move the needle, at least in any market Intel was actually in.
Intel's deep woes began around 2016, when TSMC got ahead of them fab-wise while Intel stuck with in-house manufacturing. Not a little ahead, but years ahead, to the point that Intel's node names became mostly a branding exercise to assert equivalence ("Intel 7" was just 10nm rebranded, and on the current 3nm front, TSMC's 3nm is over 50% more dense than Intel's claimed "Intel 3").
At roughly the same time, AMD shipped Zen, coming out of a long run of bad microarchitectural designs.
Intel basically invested in trying to branch out in unproven directions rather than focusing on actually salvaging their core business. Intel partners were given huge budgets to try Intel's wacky ideas that no one asked for, and Intel CPUs were burdened with things like a built-in FPGA, an HPC fabric, or phase-change memory sticks (Optane). They thought that if they could make a rack of CPU sockets, memory, and I/O that could be freely reassociated, they'd have a gold mine, despite no one really wanting that (software does fine with the traditional setup).
Then, to really drive things home, NVIDIA comes along and every IT budget starts throwing every last dollar at GPUs, with as little as possible spared for the supporting components, like CPUs.