on bad necrocomputing takes
Sometimes I see takes like "computers used to do the same job with way fewer resources; software is just shit nowadays"
And I can only wonder whether they ever developed software back when real-time compositing was unaffordably expensive, and so were proper process isolation and a million other 'modern conveniences' that make software, y'know, not suck
If we're limiting the discussion to OS/firmware-level improvements like "process isolation" or "speculative execution mitigation" and calling them "modern conveniences", I might agree with you, but so many of the decisions that make up modern computing rest on the assumption that "computers only get faster".
Can you honestly say the last decade's fad of replacing HTML/CSS with single-page apps made the web suck less? How about replacing desktop apps with single-use web browsers that eat up 2 GB of RAM apiece? Does the mere concept of needing to stand up a "language server" for _syntax highlighting_ not raise your hackles?
Fair enough, I only started writing software professionally less than a decade ago (and primarily in Python, so I'm part of the problem too), but that doesn't mean I can't see all the waste in the profession.