10-20 years ago it was fashionable in HPC research papers to assert that "Moore's Law is dead", and then equivocate to mean that clock frequencies, and ultimately the performance of sequential code, were frozen.
Platitudes! None of those things is strictly true. Sequential code runs *waaaay* faster on hardware from 2025 than it did on hardware from back then.
A parallel I'm seeing in 2025 is claims, made without evidence, that "Generative AI allows developers to be far more productive", or promises of "10x speed boosts", usually from people who don't write software.
Talk to folks who build production systems and you'll get a much more tepid response.
The emperor has no clothes??
I say this as someone who has been working directly in the modern ML space for the past four years, and who regularly tries out new AI development tools.
I'm not seeing huge productivity gains or anything like 10x speed boosts. In fact, I often find that AI tools turn into a tarpit, making the development work take longer.
I don't yet have a good intuition for when an AI assistant will be helpful and when it will be a tarpit.