It seems like it would be trivial for them to reduce quality control and have customers just “deal with” chips that aren’t as stable. How come they aren’t doing this?
I don’t remember the details, but the Meltdown and Spectre vulnerabilities from a few years ago also felt like corner-cutting. They were enabled by fundamental architectural decisions around speculative execution, which didn’t sound like a good idea to me when I read up on them.
You don’t think they’re doing that as I’m writing this??
This was Intel’s 13th- and 14th-gen (Raptor Lake) i7/i9 lines.
First of all, they do do this. The AMD “tri-core” chip was a quad core that failed QA. And many slower chips are in fact faster chips that failed QA at their rated speed and had to be underclocked.
However, the real answer is that when a bit gets flipped, most errors are recoverable, but a good chunk at the hardware level are not. You’ve seen this with blue screens. Chips like that wouldn’t sell; blue screens and kernel panics aren’t acceptable to people.
This is the correct answer right here. The process is called “chip binning.” https://www.techspot.com/article/2039-chip-binning/
Most of them are easily replaced. You won’t notice a difference whether it’s AMD, Intel, Seagate, Kingston, etc. that made your drive, RAM, or CPU.
Because they can be replaced so easily, they can’t enshittify. The tech giants that do enshittify have made themselves very hard to replace in one way or another.
I remember when Intel made Pentium CPUs that had a small math error in some very specific floating point calculations. They were so afraid of damaging their reputation (which was still excellent at the time) that they offered every Pentium owner across the globe (including me) a free new Pentium CPU without the bug, shipped to us at their expense, and even sent out a courier to pick up the old CPU (again for free) a few weeks later once we had time to swap them. That was basically the opposite of what you’re suggesting.
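For the curious, the bug was in the FPU’s division lookup table, so a handful of specific operand pairs divided slightly wrong. Here’s a quick sketch of the widely circulated check, written from memory (treat the exact flawed digits as approximate):

```python
# Classic FDIV check (from memory; the exact erroneous digits are approximate).
x, y = 4195835.0, 3145727.0
remainder = x - (x / y) * y
print(remainder)  # ~0 on a correct FPU; about 256 on an affected original Pentium
```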
More recently, didn’t Intel fire some plant manager who was knowingly shipping out CPUs with literal rust on vital components?
The programming language? Yea, that’s a fireable offence.
/S
Isn’t that Intel’s MO?
Happens all the time… Back in the day if you had a chip with a failed math co-processor, you just sold it as a cheaper version without it. That kind of thing.
They kind of do, or at least used to.
If memory serves, they would take higher-end chips that didn’t pass QA for that product line, disable some cores or whatever, and sell them as lower-end chips.
because it would tank their reputation and nobody would buy their chips.
you have to understand, chips are commodity items. they aren’t ‘sexy’ or marketed heavily. people don’t choose one chip brand over another based on how ‘sexy’ it is. they mostly just don’t care as long as it works. technology at this point is pretty much a commodity/utility for most people and most use cases, apart from high-end, low-volume applications like PC gaming, etc.
the reason a car manufacturer can get away with this is that the ‘image’ of the car is so appealing that people will overlook how shit it is and how poorly it’s built. computer chips don’t have sex or image appeal in this way, so chipmakers really can’t afford to produce a shitty, unreliable product.
consumer-grade chips are already ‘low quality’ compared to the server-grade stuff anyway. they have looser tolerances and lack error correction and other features that are necessities for server-level computing. not to mention the really high-end stuff you find in supercomputing and military/industrial hardware, etc.
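for anyone wondering what that error correction buys you: ECC memory stores extra check bits next to the data so a single flipped bit can be located and repaired on the fly. here’s a toy sketch of the idea using a textbook Hamming(7,4) code (real server ECC is SECDED over 64-bit words, i.e. much wider, but the principle is the same):

```python
# Toy Hamming(7,4) single-error-correcting code -- the textbook idea behind ECC memory.

def encode(d):
    """d: four data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """c: 7-bit codeword with at most one flipped bit -> the original four data bits."""
    c = list(c)
    # each syndrome bit re-checks one parity group; together they spell out the
    # 1-indexed position of a single-bit error (0 means no error was found)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = 4 * s4 + 2 * s2 + s1
    if pos:
        c[pos - 1] ^= 1  # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = encode(data)
codeword[5] ^= 1                  # simulate a cosmic-ray bit flip in storage
assert correct(codeword) == data  # the code locates and repairs the flipped bit
```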
It’s already common for chipmakers to disable cores entirely because of a single defect. With 100 billion+ transistors on a die, even doubling the per-transistor error rate could multiply the number of wasted dies a hundredfold (rough yield math sketched below).
Making the stencils (photomasks) is already expensive; there’s no real way to cheap out on them.
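For a back-of-the-envelope feel for why yield is so sensitive, here’s a sketch under the simplest possible assumption that each transistor fails independently with probability p (real foundry yield models are more sophisticated, and the N and p values below are made up purely for illustration):

```python
import math

N = 100e9  # transistors per die, roughly the scale of a big modern chip

def good_die_fraction(p):
    """Fraction of dies with zero defective transistors, assuming each of the
    N transistors fails independently with probability p (naive Poisson-style model)."""
    return math.exp(-p * N)

for p in (1e-12, 2e-12, 1e-11, 2e-11):
    y = good_die_fraction(p)
    print(f"p = {p:.0e}: ~{y:.1%} good dies, ~{1 - y:.1%} wasted")
```

The exact multiplier depends heavily on where you start, but yield falls off exponentially in the per-transistor defect rate, so at this transistor count even tiny changes in defect rate translate into a lot of wasted silicon.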
Make games. Tabletop and video games.







