Of course they're not going to admit it if they were doing it. Denial absolutely doesn't mean guilt, as one poster said, but denial also proves nothing. Maybe they did it, maybe they didn't. There's certainly reason to suspect foul play given their history, and while their recent behavior isn't as bad as this allegation or what they pulled years ago, they've still done some questionable stuff lately. So it's not like they've completely changed from who they were back then; they're just not quite as aggressive about it.
We can argue forever about what caused the stagnation of the past several years. Personally, I'm sure part of it was that they simply ran into problems, but I also believe a lot of it came down to a lack of competition: they felt no need to push forward, and I hate that attitude and think it's wrong. On the other hand, as much as I hate Intel for doing it, is it really wrong for a company to save money and increase profits by slowing down when it can, while keeping something extra ready for when the competition resurfaces? I hate Intel in general, and it pisses me off that my last laptop was a slow POS because they didn't improve their chips more, and that my current one isn't better than it is because of years of very slow progress, but I don't entirely fault them for it.
I think the "evidence" here is extremely weak and not even worth reporting on, but I also wouldn't be at all surprised if this did happen, and if it did I hope real evidence comes to light. But @_MT_ and @Russel are right, it could very well be a combination of AMD preferring to keep their systems in the budget range and OEMs being wary of investing heavily in AMD due to previous issues and lackluster performance and current shortages.
Quote from: _MT_ on January 21, 2021, 22:28:31
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores. You can argue there is a chicken and the egg problem going on - applications won't be designed to take advantage of dozens of cores if there are no consumer processors with dozens of cores. Intel clearly wasn't interested in trying to change the status quo. Even today, it still hasn't really changed. It's actually quite a lot of work. And developers won't do it without a good reason.
I agree with another poster who shall remain nameless, though I'll be more civil. Not only are many programs these days capable of using multiple cores (I see it all the time when I watch CPU usage while doing various things), but even if 70% of the software a person uses is limited to a single core and 15% is limited to 2-4 cores, you only have to run a few things at once before one app is on one core, another is on two, another on four, and another on six. In other words, even if some of the apps you use can't take advantage of many cores individually, taken together they very well might. There's a rough sketch of that idea below.
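To make that concrete, here's a minimal sketch (Python, standard library only; the "app" names are just made-up placeholders, not real programs) of how several single-threaded programs running at the same time still end up spread across multiple cores by the OS scheduler:

```python
# Minimal sketch: several independent single-threaded "apps" running at once
# still keep multiple cores busy, because the OS scheduler places each
# process on its own core when one is free.
import multiprocessing as mp
import os
import time

def single_threaded_app(name: str, seconds: float = 3.0) -> None:
    """Stand-in for one app that can only use a single core: a plain busy loop."""
    end = time.time() + seconds
    counter = 0
    while time.time() < end:
        counter += 1  # CPU-bound work confined to this one process/thread
    print(f"{name} finished on PID {os.getpid()}")

if __name__ == "__main__":
    # Four hypothetical "apps", each limited to one core on its own...
    apps = ["browser_tab", "music_player", "chat_client", "file_indexer"]
    procs = [mp.Process(target=single_threaded_app, args=(a,)) for a in apps]
    for p in procs:
        p.start()
    # ...but together they can occupy up to four cores at the same time.
    for p in procs:
        p.join()
```

If you run something like that while watching a per-core monitor (Task Manager, htop, etc.), you'll see several cores busy even though no single process ever uses more than one, which is the same thing that happens when you just have a handful of ordinary apps open at once.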