Pondering whether it would make sense to classify programming languages on a "precision" axis: at one end, you have to specify exactly what you want it to do, and in return you get guaranteed, predictable behaviour; at the other, the language does hopefully-the-right-thing with little specification work, at the cost of less predictable behaviour and it sometimes guessing wrong
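To make the two ends of that axis a bit more concrete, here's a small illustrative sketch in TypeScript (the variable and function names like `fromForm` and `addToInput` are made up for the example). The loosely-specified version lets the language guess how to combine a string and a number; the precise version spells out the conversion and the failure case, at the cost of more work up front.

```typescript
// Low-precision end: plain JavaScript coercion rules. The language guesses how
// to combine mismatched types; sometimes the guess is right, sometimes not.
const fromForm: any = "2";       // e.g. user input arrives as a string
console.log(fromForm + 3);       // "23" -- guessed string concatenation
console.log(fromForm * 3);       // 6    -- guessed numeric coercion

// High-precision end: spell out the conversion and the failure case yourself.
// More specification work, but the behaviour is exactly what you wrote down.
function addToInput(raw: string, n: number): number {
  const parsed = Number(raw);
  if (Number.isNaN(parsed)) {
    throw new Error(`not a number: ${raw}`);
  }
  return parsed + n;
}
console.log(addToInput("2", 3)); // 5 -- no guessing involved
```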
Also feels like this maps neatly onto "precision" in other disciplines (3D modelling, manufacturing, etc.), where working at higher precision gets you a more exact result but usually at increased cost elsewhere