The fact that I still get plenty of use out of a 13-year-old laptop should answer that question. Computers are getting better, but only incrementally. They're packing more transistors onto each die without significantly increasing the density, which is why the big stuff now requires water cooling. Regular CPUs have plateaued; GPUs and NPUs are a parlor trick of stripping down functionality and packing more cores onto the die, again at no increased density. Quantum computing is perpetually in a state of "almost there."
And I don't think most people need to care. There's no requirement for ever-increasing density, and hitting a wall could actually be beneficial.