
So here's a problem in computing that has been fascinating me for a long time: "all that a computer can do" is obviously not generalizable into something that avoids the need for custom "low-level" programming work.

But the vast majority of real-world use cases - which basically boil down to a digital filing cabinet - *do* seem to be generalizable that way, to a limited set of high-level operations. So why do all attempts to do this, so far, suck?
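For concreteness, here is a minimal sketch (my own illustration, not something from the post) of what such a "limited set of high-level operations" might look like - essentially generic create/read/update/delete plus search over untyped records; the `FilingCabinet` name and its methods are hypothetical:

```python
# Hypothetical sketch of a "digital filing cabinet" reduced to a small
# set of generic high-level operations: CRUD plus search over records.
from dataclasses import dataclass, field
from typing import Any

@dataclass
class FilingCabinet:
    records: dict[int, dict[str, Any]] = field(default_factory=dict)
    next_id: int = 1

    def create(self, record: dict[str, Any]) -> int:
        # Assign a fresh id and store a copy of the record.
        rid = self.next_id
        self.next_id += 1
        self.records[rid] = dict(record)
        return rid

    def read(self, rid: int) -> dict[str, Any]:
        return self.records[rid]

    def update(self, rid: int, changes: dict[str, Any]) -> None:
        self.records[rid].update(changes)

    def delete(self, rid: int) -> None:
        del self.records[rid]

    def search(self, **criteria: Any) -> list[int]:
        # Return ids of records whose fields match all given criteria.
        return [rid for rid, rec in self.records.items()
                if all(rec.get(k) == v for k, v in criteria.items())]

cabinet = FilingCabinet()
rid = cabinet.create({"title": "Invoice 42", "status": "unpaid"})
cabinet.update(rid, {"status": "paid"})
print(cabinet.search(status="paid"))  # the one record we created
```

The interesting question the post raises is why tools built around exactly this kind of interface (think generic database front-ends) still end up needing custom low-level work in practice.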
