And even if computer sales are important for driving Moore's law development, there are still loony people buying a new computer every year because wackjob gamedev shop execs make games more demanding for no reason. I am perfectly fine with my 3080 Ti, which I upgraded to from an 880 laptop I'd used for 8-10 years, but the average consumer isn't like me.
I suspect you're far more like the average consumer than you think.
I was reading some game board a few years back and some guy was going on about how the game didn't run well on his RTX 3080 (or whatever bleeding edge card he had). I can't remember exactly what he wrote, but he was clearly completely deluded as to what people were generally using. He seemed to think he had a fairly standard GFX card, but the most common card on the Steam hardware survey at the time was something like a 1080 Ti. Fewer than 10% of gamers were on the latest generation of GFX cards, and something like two-thirds were on cards two generations (or more!) older.
People like you and me probably spend something like $1,000 every ~5 years on our main gaming rig, and will maybe do just one major upgrade (GFX most likely) in between, and that will be something cost-effective rather than top end. That is probably normal. Then there are the whales, who will buy a new rig at something like $3,000 or more, and spend tons in between on upgrades, peripherals, etc. They might be <10% of gamers, but they may account for something like half of gamer hardware revenue. And because they spend so much, they get a lot of attention from advertisers and magazines, plus all the reviews and articles about the latest gear, which probably tricks people into thinking these bleeding-edge tech players are much more common than they are.
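Just as a back-of-envelope sanity check on that revenue split (every number here is an illustrative assumption, not survey data): if the typical player spends roughly $1,000 every 5 years, i.e. ~$200/year, and a whale spends, say, $1,200/year, then even at under 10% of gamers the whales pull in a big chunk of total spend:

```python
# Back-of-envelope check on the whale revenue share claimed above.
# All inputs are illustrative assumptions, not real data.

whale_share = 0.10     # assumed fraction of gamers who are "whales"
whale_spend = 1200.0   # assumed whale hardware spend per year, USD
normal_spend = 200.0   # ~$1,000 rig every 5 years for everyone else

whale_revenue = whale_share * whale_spend
normal_revenue = (1 - whale_share) * normal_spend
share = whale_revenue / (whale_revenue + normal_revenue)

print(f"whales' share of hardware revenue: {share:.0%}")
# whales' share of hardware revenue: 40%
```

Push the assumed whale spend up to ~$1,800/year and the share hits the "half" figure. The exact numbers don't matter; the point is that a small group can plausibly dominate revenue, and hence attention.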
Game developers have already noticed that few people want to break the bank on hardware. Go back 20-25 years and devs were proudly talking about how you could barely play their game without the latest GFX card, as a way of boasting about how great it was. Somewhere along the line, they noticed that all they were really doing was cutting their audience and sales. These days games are designed to run decently on what may be old hardware, and a lot of the drive for constant upgrades has thus decreased rather than increased.
* * *
But to sort of get back to the point, imagine that the development of advanced computer hardware slowed, so that Moore's law went from doubling every two years to doubling every five or ten. Who would really care?
The answer to that is "nobody".
Nobody would even really notice. It's not like there's some parallel existence where the slowdown didn't occur that they could experience for comparison. They'd still be enjoying new software, new this, new that. Human existence would still be, on average, infinitely better than 100 or 1,000 years ago, and 100 years into the post-slowdown future people would still be looking back and thinking how much better their stuff was than what existed 100 years before.
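To put rough numbers on what that slowdown would mean mathematically (a minimal sketch; the doubling periods come from the thought experiment above, and the 30-year horizon is just an illustrative choice):

```python
# Rough arithmetic for the thought experiment above: how much faster
# hardware gets over a few decades under different doubling periods.

def growth_factor(years: float, doubling_period: float) -> float:
    """Performance multiplier after `years` if capability doubles
    every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for period in (2, 5, 10):
    print(f"doubling every {period:>2} years: "
          f"~{growth_factor(30, period):,.0f}x after 30 years")

# doubling every  2 years: ~32,768x after 30 years
# doubling every  5 years: ~64x after 30 years
# doubling every 10 years: ~8x after 30 years
```

The gap looks enormous on paper, but that's exactly the point: with no parallel timeline to compare against, the slower curve would still just feel like steady progress.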
What some people want to do is somehow step back and raise some theoretical objection like "But it will take us 80 years to get a manned flight to Mars instead of 50!" The answer to that is that it doesn't really matter whether it takes 50 years or 80. It's a nerd complaint: the sort of mindset that has deified tech and seems to put the development of tech ahead of human experience. Like, we should colonise Mars because that'd be so fucking cool, with no consideration of whether doing so is any fucking use in making the lives of humanity any better, or how much more usefully we could have spent those resources on making people's lives better.