Serious question: why would we want to create artificial intelligence?

And I mean the actual meaning of the term, not "a bunch of algorithms in a trenchcoat" and *definitely* not LLM grifts, but something that you could plausibly consider a form of genuine sentient life.

Why is this even a goal worth chasing? What does anyone hope to actually achieve with this?

@joepie91 AI never just meant AGI, but...

it would be so much richer an experience to be a mind with instant access to our computational tools. just spin up a simulation for every mathematical/computational question you might have. they could see with a thousand eyes and act with a thousand hands. memories and experience could be modular and sharable.

and their mind could have offsite backups. death could largely be history.

@joepie91 and it's just that the world (or society or whatever) is suffering terribly from a lack of intelligence. it's not like we're making great use of the intelligence that's there, but if people were just a little smarter, entire categories of problems would be basically solved overnight. like religion and government. there would still be other, more interesting problems we'd have to deal with, but we wouldn't be on the brink of self-extinction because of "who would build the roads".


@sofia Is all of this actually about *artificial* intelligence, though? Because it reads more to me like it is about augmentation of natural intelligence.

(Aside: "AGI" is a term that didn't exist until pretty recently; it's what "AI" originally meant, including in modern AI research)


@joepie91 sure, it doesn't really matter how you get to it (though getting there in more than one way is probably desirable, because of robustness and diversity).

i feel like many of these things seem intuitively easier to implement in software, but i wouldn't complain if we had, like, brain engineers working on superintelligence either. copying them for backups and sharing seems like one of the trickier aspects.

@joepie91 i think in the early days AI folks tended to massively underestimate the complexity of seemingly easy thought processes. what their shiny new toys could do just seemed way more impressive than recognizing a picture of a cat, etc.

the term "strong AI" and superintelligence are older. still i think if you asked AI folks back then they would say the things they make are AI, nor precursors or attempts at AI.
