The fact that I still get plenty of use out of a 13-year-old laptop should answer that question. Computers are getting better, but only incrementally. They're packing more transistors onto each die, but they're not really increasing the density by a significant margin, which is why the big stuff now requires water cooling. Regular CPUs have plateaued, GPU/NPU is a parlor trick of reducing functionality and packing more cores onto the die, but again, not at any increased density. Quantum computing is perpetually in a state of "almost there".
And I don't think most people need to care. There's no requirement for ever-increasing density. Hitting a wall could actually be beneficial.
Modern silicon is dead. Moore's law is broken - at least from a traditional perspective.
But OTHER technology is advancing exponentially every 18 months. So it isn't REALLY broken. It just changed when we hit a ceiling.
But OTHER technology is advancing exponentially every 18 months.
Is it, though?
All of the focus right now is on GPU and NPU density, but they aren't really advancing those, from a transistor-density perspective, any faster than CPUs have been advancing.
The whole point of GPU and NPU, after all, is that it's "really really RISC" - the cores are specialized for high-density, low-complexity compute instead of general-purpose compute. So they're packing as many of them onto the die as they can, but the process itself isn't advancing any faster than it is in the CPU world.
(This is my current understanding; if I'm full of shit please feel free to call me out here.)
All of the significant advancements have come from the realization that a GPU, originally intended for graphics, can be used for any task that can be expressed as "reduce it to simple math and then parallelize the heck out of it". From a compute perspective, AI is just Bitcoin mining on a different data set, and Bitcoin mining is just CoD on a different data set. (OK, that's *really* hyperbolic, but you get the idea.)
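To make that concrete, here's a minimal, hypothetical CUDA sketch of the "reduce it to simple math and parallelize the heck out of it" idea: one trivial multiply-add per thread, launched across a whole array at once. The kernel name scale_add and the sizes are made up for illustration; the point is that the per-thread work is dumb, and the win is the thread count, not a smarter core.

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Each thread does one trivial multiply-add. Graphics, mining, ML inference:
// from the hardware's point of view they all end up looking roughly like this.
__global__ void scale_add(const float *x, const float *y, float *out,
                          float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = a * x[i] + y[i];
    }
}

int main() {
    const int n = 1 << 20;                               // ~1M elements
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f), hout(n);

    float *dx, *dy, *dout;
    cudaMalloc((void **)&dx, n * sizeof(float));
    cudaMalloc((void **)&dy, n * sizeof(float));
    cudaMalloc((void **)&dout, n * sizeof(float));
    cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_add<<<blocks, threads>>>(dx, dy, dout, 3.0f, n);
    cudaDeviceSynchronize();

    cudaMemcpy(hout.data(), dout, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", hout[0]);                    // expect 3*1 + 2 = 5

    cudaFree(dx); cudaFree(dy); cudaFree(dout);
    return 0;
}
```

None of that depends on a better fabrication process; it's the same transistors, just organized into thousands of simple lanes instead of a few complex ones.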
Fundamental advances in manufacturing process could be equally applied to CPU and GPU.
I don't know - for sure. My last understanding goes back to when they FIRST claimed Moore's law was "broken," and other voices said, "Not really."
But it wasn't about GPUs back then - it was something ELSE that suddenly leapt in to fill the void Moore's law left in silicon.
Basically, something ELSE was meeting the criteria - every 18 months. Then people just stopped caring - at least, consumers did. We've had CPUs plenty powerful enough to surf Facebook for at least a decade. Even a Pi 4 can do it. That is what average consumers care about.
But OTHER technology is advancing exponentially every 18 months.
Is it, though?
Right, but Gordon Moore wasn't talking specifically about CPU or GPU or anything else, he was talking about fabrication process. His metric was that the number of transistors that could be packed into a given area was doubling roughly every two years (the popular version says every 18 months). At the time, there were fewer metrics that affected computing speed and capacity, and that was one of the big ones.
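Just to put that cadence in symbols (an idealized sketch, assuming a constant doubling period T, which real processes never quite deliver):

```latex
% Idealized Moore's-law cadence: transistor density N after t years,
% starting from N_0, with doubling period T (~2 years in the classic form).
\[
  N(t) = N_0 \cdot 2^{\,t/T}
\]
% e.g. with T = 2 years, a decade compounds to 2^{10/2} = 32x the
% transistors in the same area.
```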
Moore's Law ended quite some time ago. Although incremental advances in process have happened, the majority of the "bigger/faster" in computing over the last decade or so has come from parallelization (multiple cores on the die) and from massive parallelization (doing work on the GPU with thousands of cores).
For "regular" tasks, that's why a ten year old laptop is still useful now.
True. I know this: Intel doesn't - or didn't, around 2003 - consider themselves a technology company. They're a manufacturing company. They used to say, "We don't build technology, we build widgets - whatever we have to manufacture to stay in business, we will."
It didn't have to be Intel core processors. They're an industrial company.
Right, but Gordon Moore wasn't talking specifically about CPU or GPU or anything else, he was talking about fabrication process.
Wrong. I live in Chandler, about 5 miles from the Intel technology corridor - and I'm a former Intel employee.
They're not interested in consumer computing electronics at all anymore. The margins are so low, there are so many competitors - and as you mentioned, GPUs are where the real computing power is - that is where bitcoin mining happens, for example. The core processor doesn't really matter, does it?
But about 8 years ago - around the time they laid off EVERYONE I knew who still worked at Intel - they were literally told, "the only people facing a layoff are those who have received a CAP (corrective action program) in the last 18 months," and then EVERYONE got CAPs. People who had been there 24 years, on their 3rd sabbatical, always "exceeds expectations" on their reviews, suddenly got CAPs and got laid off.
And all the tech companies along Price Ave. pulled out and moved. PayPal was the biggest, but there were dozens of others. And in moved Northrop Grumman, Boeing, and Lockheed.
And then Intel started construction on their Chandler fab - which they had abandoned decades earlier. Now it is this dystopian-looking mega-city... but they mostly didn't build UP - they built out, and more than that, I went and looked... they built DOWN... 4- or 5-story holes DOWN into the desert.
So a couple of related stories - I knew this Indian guy, Samir - he was an electronics engineer who designed certain gates in Intel CPUs. Right... no one person builds the whole processor - he built one corner of it, like building one corner of a master-planned community. He worked a lot with Siemens. Anyhow, he told me, WAY before the fact, about how the NFL was installing cameras 360 degrees around all their stadiums that would feed video into a database and use CGI to determine in real time whether a player was in or out of bounds on a play - with a precision that couldn't be disputed. You could pan around, zoom, see EXACTLY where a foot was, where the ball was - whether the call was good or bad. He told me I couldn't write about it - it was top-secret stuff... 3 or 4 years later was the first time I saw it live on the NFL, and it was EXACTLY as he described it.
He also told me about what they were doing with drone technology.
This isn't NFL tech - this is military-industrial complex technology. They're not a consumer electronics company anymore. They *hate* it when I speak out about this in public - but their official policy is to ignore it, because if they address it, it gives the claim credibility. That has ALWAYS been their policy about critical opinions of their business practices.
And that could be their exit strategy -- just being a fab for whoever and whatever dominates the market after x86 finally achieves the death it so desperately deserves.